Change Data Capture (CDC) for tables uses database triggers that fire and record changes as comma-separated values into a runtime table called DATA. For file sync, a similar mechanism is used, except that changes to the metadata about files are captured. The changes are recorded as insert, update, and delete event types. The subsystem installs and maintains triggers on tables based on the configuration provided by the user, and it can automatically detect schema changes on tables and regenerate triggers.

Routers run across new changes to determine which target nodes will receive the data. Changes are grouped into batches and assigned to target nodes in the DATA_EVENT and OUTGOING_BATCH tables. The user configures which routers to use and what criteria are used to match data, creating subsets of rows if needed.

During the extract phase, changes are extracted from the runtime tables and prepared to be sent as an outgoing batch. If large objects are configured for streaming instead of capture, they are queried from the table. Special event types like "reload" for initial loads are also processed.

If transformations are configured, they operate on the change data either during the extract phase at the source node or the load phase at the target node.
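The capture-route-batch flow described above can be sketched as a toy model. This is an illustrative sketch only, not the actual implementation: the table names (DATA, DATA_EVENT, OUTGOING_BATCH) mirror the runtime tables named in the text, but the functions, router predicates, and data shapes here are hypothetical stand-ins.

```python
# Hypothetical sketch of trigger-based capture, routing, and batching.
# Names like Change, capture(), route(), and batch() are illustrative only.
import csv
import io
from dataclasses import dataclass


@dataclass
class Change:
    data_id: int
    table: str
    event_type: str  # "I" (insert), "U" (update), or "D" (delete)
    row_data: str    # captured row as comma-separated values (the DATA table)


def capture(data_id, table, event_type, row):
    """Simulate a trigger firing: record the row as CSV, like the DATA table."""
    buf = io.StringIO()
    csv.writer(buf).writerow(row)
    return Change(data_id, table, event_type, buf.getvalue().strip())


def route(changes, routers):
    """Run each router over new changes to pick target nodes (DATA_EVENT)."""
    events = []
    for ch in changes:
        for node, matches in routers.items():
            if matches(ch):
                events.append((ch.data_id, node))
    return events


def batch(events, batch_size=2):
    """Group routed events into per-node batches (OUTGOING_BATCH)."""
    per_node = {}
    for data_id, node in events:
        per_node.setdefault(node, []).append(data_id)
    batches = []
    for node, ids in per_node.items():
        for i in range(0, len(ids), batch_size):
            batches.append({"node": node, "data_ids": ids[i:i + batch_size]})
    return batches


changes = [
    capture(1, "customer", "I", ["100", "Alice", "east"]),
    capture(2, "customer", "U", ["101", "Bob", "west"]),
    capture(3, "customer", "D", ["102", "", "east"]),
]
# A router here is just a predicate that subsets rows by a column value.
routers = {
    "store-east": lambda ch: ch.row_data.endswith("east"),
    "store-west": lambda ch: ch.row_data.endswith("west"),
}
print(batch(route(changes, routers)))
```

In the real system the routers and their matching criteria are user configuration rather than code, but the idea is the same: each captured change is matched against router criteria to produce per-node events, which are then grouped into outgoing batches.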