Documentation ¶
Overview ¶
The cache status header will be set to:
- `HIT` when all requests are cache hits
- `MISS` when all requests are cache misses
- `PARTIAL` when there is a mix of cache hits and misses
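The three rules above can be sketched as a small helper. The function name `determineBatchCacheStatus` and the string-slice input are assumptions for illustration; the package's actual implementation is not shown in this documentation.

```go
package main

import "fmt"

// determineBatchCacheStatus derives the combined cache status header for a
// batch from the cache statuses of its individual requests, following the
// HIT / MISS / PARTIAL rules described above. (Hypothetical sketch.)
func determineBatchCacheStatus(statuses []string) string {
	hits, misses := 0, 0
	for _, s := range statuses {
		if s == "HIT" {
			hits++
		} else {
			misses++
		}
	}
	switch {
	case misses == 0:
		return "HIT" // all requests were cache hits
	case hits == 0:
		return "MISS" // all requests were cache misses
	default:
		return "PARTIAL" // a mix of cache hits and misses
	}
}

func main() {
	fmt.Println(determineBatchCacheStatus([]string{"HIT", "HIT"}))
	fmt.Println(determineBatchCacheStatus([]string{"HIT", "MISS"}))
}
```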
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func CreateBatchProcessingMiddleware ¶

func CreateBatchProcessingMiddleware(singleRequestHandler http.HandlerFunc, config *BatchMiddlewareConfig) http.HandlerFunc
CreateBatchProcessingMiddleware handles batch EVM requests. The batched request is pulled from the context; each request is then proxied via the singleRequestHandler, and the responses are collated into a single result that is served to the client.
Types ¶
type BatchMiddlewareConfig ¶
type BatchMiddlewareConfig struct {
	ServiceLogger                  *logging.ServiceLogger
	ContextKeyDecodedRequestBatch  string
	ContextKeyDecodedRequestSingle string
	MaximumBatchSize               int
}
BatchMiddlewareConfig contains the configuration options required by the batch processing middleware.
type BatchProcessor ¶
type BatchProcessor struct {
	*logging.ServiceLogger
	// contains filtered or unexported fields
}
BatchProcessor makes multiple requests to the underlying handler and combines all the responses into a single response. It assumes each individual response is valid JSON; the responses are then marshaled into a single JSON array.
func NewBatchProcessor ¶
func NewBatchProcessor(serviceLogger *logging.ServiceLogger, handler http.HandlerFunc, reqs []*http.Request) *BatchProcessor
NewBatchProcessor creates a BatchProcessor that sends each of reqs to the handler and combines their responses.
func (*BatchProcessor) RequestAndServe ¶
func (bp *BatchProcessor) RequestAndServe(w http.ResponseWriter) error
RequestAndServe concurrently sends each request to the underlying handler. The responses are then collated into a JSON array and written to the ResponseWriter.