Creates, updates, deletes, gets or lists a `records` resource.
## Overview
| Name | `records` |
|---|---|
| Type | Resource |
| Id | `confluent.kafka.records` |
## Fields
`SELECT` not supported for this resource, use `SHOW METHODS` to view available operations for the resource.
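For example, a minimal sketch of listing this resource's operations, assuming the standard StackQL `SHOW METHODS` syntax:

```sql
-- List the operations (methods) available on this resource.
SHOW METHODS IN confluent.kafka.records;
```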
## Methods
| Name | Accessible by | Required Params | Description |
|---|---|---|---|
| `produce_record` | `EXEC` | `cluster_id`, `topic_name` | Produce records to the given topic, returning delivery reports for each record produced. This API can be used in streaming mode by setting the "Transfer-Encoding: chunked" header. For as long as the connection is kept open, the server will keep accepting records. Records are streamed to and from the server as Concatenated JSON. For each record sent to the server, the server will asynchronously send back a delivery report, in the same order, each with its own `error_code`. An `error_code` of 200 indicates success. The HTTP status code will be 200 OK as long as the connection is successfully established. To identify records that have encountered an error, check the `error_code` of each delivery report. Note that the `cluster_id` is validated only when running in Confluent Cloud. This API currently does not support Schema Registry integration; sending schemas is not supported, and only BINARY, JSON, and STRING formats are supported. |
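Below is a minimal sketch of invoking `produce_record` with a StackQL `EXEC` statement. The cluster id `lkc-00000`, the topic name `my-topic`, and the JSON body fields (`partition_id`, `key`, `value`) are illustrative placeholders; the body shape assumes the Confluent REST Produce API, using only the supported BINARY, JSON, and STRING formats.

```sql
-- Produce a single record to a topic (illustrative values; replace the
-- cluster id, topic name, and request body with your own).
EXEC confluent.kafka.records.produce_record
@cluster_id = 'lkc-00000',
@topic_name = 'my-topic'
@@json =
'{
  "partition_id": 0,
  "key":   { "type": "STRING", "data": "order-1" },
  "value": { "type": "JSON",   "data": { "status": "created" } }
}';
```

Each record produced yields a delivery report; check its `error_code` (200 on success) rather than the HTTP status code to detect per-record failures.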