Pixlise Core

[DOI] [Open in Gitpod] [build status]

What is it?

PIXLISE Core provides the API and data management processes for the PIXLISE platform.

PIXLISE is deployed at https://www.pixlise.org

Building

The core package is written in Go and contains a number of components required for deployment of the PIXLISE platform. The simplest way to build the code is to run

make build

within the project root directory. This builds a number of binaries into the _out directory. The main API binary is named pixlise-api-xxx, where xxx is the target platform. By default we build for Mac, Linux, and Windows.
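For example (the exact binary names here are illustrative; check _out after building):

    make build
    ls _out
    # e.g. pixlise-api-linux  pixlise-api-mac  pixlise-api-windows.exe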

Code Generation

  • go install github.com/favadi/protoc-go-inject-tag@latest
  • Run ./genproto.sh
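The inject-tag step lets comments in the .proto files add extra Go struct tags to the generated code. As a hypothetical sketch (the message, field, and tag below are illustrative, not taken from this repo), a // @gotags: comment above a proto field ends up on the generated struct:

    // In the .proto source:
    //
    //   message Scan {
    //     // @gotags: bson:"_id"
    //     string id = 1;
    //   }
    //
    // After protoc generates the Go code and protoc-go-inject-tag runs over it,
    // the extra tag from the comment is appended to the generated field:
    type Scan struct {
        Id string `protobuf:"bytes,1,opt,name=id,proto3" json:"id,omitempty" bson:"_id"`
    }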

Run-time Configuration

Executing the API requires several environment variables to be set, including the AWS ones listed below. A config file is also read; its path can be specified with the customConfigPath command line argument. This config specifies buckets and other parameters that allow the API to execute containers, log errors, etc.

To see the configuration JSON structure, look at the APIConfig structure in /api/config/config.go
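With a config file in hand, launching the API might look like this (the binary name and flag form are assumptions based on the sections above):

    ./_out/pixlise-api-linux -customConfigPath ./config.json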

Docker / Kubernetes

TODO

Required Environment Variables
  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY
  • AWS_REGION=us-west-1
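In a shell, that means something like this before starting the API (substitute real credentials):

    export AWS_ACCESS_KEY_ID="<<< LOOK THIS UP! >>>"
    export AWS_SECRET_ACCESS_KEY="<<< LOOK THIS UP! >>>"
    export AWS_REGION=us-west-1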

Developing in Gitpod

If you're wondering what the Gitpod button above is and would like to get a development environment up and running easily, visit the documentation here for more info.

Debugging in VS Code

  • Download the source.
  • Add a new configuration to your .vscode/launch.json file with program set to internal/api, and supply the environment variables the API needs. A complete entry looks something like this (the configuration name is just a label):
    {
        "name": "Launch API",
        "type": "go",
        "request": "launch",
        "program": "${workspaceFolder}/internal/api",
        "env": {
            "AWS_ACCESS_KEY_ID": "<<< LOOK THIS UP! >>>",
            "AWS_SECRET_ACCESS_KEY": "<<< LOOK THIS UP! >>>",
            "AWS_DEFAULT_REGION": "us-east-1",
            "AWS_S3_US_EAST_1_REGIONAL_ENDPOINT": "regional",
            ... Any other env variables as needed
        }
    },
  • Start a local Mongo DB in Docker by running local-mongo/startdb.sh. On startup the DB is seeded with data from JSON files. The container can be stopped, at which point it is deleted.
  • Hit debug for the config in VS Code

You may encounter errors related to having an old Go version; at the time of writing, PIXLISE Core requires Go 1.21. VS Code may also prompt you to install some plugins for Go development.

The API takes a few seconds to start up. Watch the Debug Console in VS Code! You will see:

  • A dump of the configuration the API started with
  • Mongo DB connection status
  • A listing of all API endpoints and what permission they require
  • "INFO: API Started..." signifying the API is ready to accept requests

Local Mongo database access

Download "MongoDB Compass" and when the docker container is running locally (in docker), connect to it with this connection string: mongodb://mongoadmin:secret@localhost:27888/?authMechanism=DEFAULT

Example CLI flags

-quantExecutor docker: tells the API to use local Docker as the quant executor, meaning PIQUANT jobs will run on your local development machine.
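Combined with the config flag from earlier, a full local run might look like this (binary name and config path are illustrative):

    ./_out/pixlise-api-linux -customConfigPath ./config.json -quantExecutor docker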

Documentation

Given this is written in Go, it supports godoc! Being a public repository, documentation is automatically pulled into the online Go documentation site. To view documentation locally, you can run godoc -http=:6060, and to export to a zip file you can create a directory and run godoc-static --destination=./doctest ./ (that last parameter is the current directory; if it's omitted, all Go packages are documented, and somehow the ones in this project are not!)
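For convenience, the two commands mentioned above:

    # Serve docs locally at http://localhost:6060
    godoc -http=:6060

    # Export a static copy to ./doctest, documenting only the current directory
    godoc-static --destination=./doctest ./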

Directories

Path: Synopsis

  • api
  • api/config: API configuration read from strings/JSON, plus some constants defined here.
  • api/dataimport: Implements importer triggering based on SNS queues.
  • api/dataimport/internal/datasetArchive: Implements archiving/retrieval of dataset source zip files as delivered by GDS.
  • api/dataimport/internal/output: Allows outputting (in PIXLISE protobuf dataset format) the in-memory representation of PIXL data that the importer has read.
  • api/filepaths: Defines all paths/file names used in S3 for storage of our data.
  • api/job
  • api/permission: Permission constants and helper functions for defining routes.
  • api/piquant: Storage/versioning and retrieval of PIQUANT configuration files and the currently selected PIQUANT pod version to be run.
  • api/quantification/quantRunner: Exposes interfaces and structures required to run PIQUANT in the Kubernetes cluster, along with functions to access quantification files, logs, results, and summaries of quant jobs.
  • api/router: The guts of the PIXLISE API endpoint handler/routing code.
  • api/services: Services used by API endpoint handlers and other bits of code.
  • api/ws
  • core
  • core/auth0login: Contains all the code needed to do an Auth0 login and retrieve a JWT.
  • core/awsutil: AWS utility functions to wrap some functionality and provide mocking capabilities for unit testing.
  • core/fileaccess: Provides a higher-level file access interface, implemented using local file storage as well as AWS S3.
  • core/gdsfilename: File name parser and writer, allowing us to extract metadata from the strict file name conventions defined by GDS.
  • core/kubernetes: Utilities to connect to and command a Kubernetes cluster to start and shut down pods.
  • core/logger: A common logging interface used throughout the code, with implementations using stdout and AWS CloudWatch.
  • core/mongoDBConnection: Lowest-level code to connect to Mongo DB (locally in Docker and remotely) and get consistent collection names.
  • core/timestamper: An interface to get a unix timestamp (in seconds), with an included mock that can be pre-loaded with timestamps for predictable unit test output.
  • core/utils: Exposes various utility functions for strings, generation of valid filenames and random ID strings, zipping files/directories, and reading/writing images.
  • internal/api
