Event Subscription Guide
This document outlines how customers can integrate their systems (e.g., SAP) to subscribe to real-time events from NewRosetta without manual webhook setup. NewRosetta publishes events to Kafka topics, and customers consume them through Confluent Kafka connectors using their preferred pipeline tools.
Overview
NewRosetta uses Kafka topics to broadcast events in real time. By leveraging Confluent connectors, customers can set up event subscriptions without manual webhook configuration. This approach provides:
- Reliability: Guaranteed delivery of events.
- Scalability: Handles high event volumes effortlessly.
- Flexibility: Integrates seamlessly with existing systems like SAP, Oracle, and others.
Integration Steps
Follow these steps to connect your system to NewRosetta events:
Step 1: Verify Access Requirements
- Ensure your organization has access to a Confluent Kafka cluster.
- Verify that the relevant Confluent Connector for your system (e.g., SAP, Oracle, Snowflake) is available. Refer to the Confluent Connectors Library for details.
- Obtain the topic name and authentication credentials (if required) from your NewRosetta contact.
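The details gathered in this step can be collected into a single client configuration up front. The sketch below shows the shape such a configuration might take; every value is a hypothetical placeholder, and the key names follow the common Kafka client convention, so substitute whatever your NewRosetta contact and Confluent setup actually provide.

```python
# Hypothetical connection details for the NewRosetta event topic.
# Replace every value with the details provided by your NewRosetta contact.
newrosetta_kafka_config = {
    "bootstrap.servers": "<BOOTSTRAP_SERVER>:9092",  # placeholder cluster address
    "security.protocol": "SASL_SSL",                 # encrypted transport
    "sasl.mechanism": "PLAIN",
    "sasl.username": "<API_KEY>",                    # placeholder credential
    "sasl.password": "<API_SECRET>",                 # placeholder credential
}
topic = "newrosetta.events"  # example topic name used in this guide

# Sanity-check that nothing required is missing before configuring a connector.
required = {"bootstrap.servers", "security.protocol", "sasl.username", "sasl.password"}
missing = required - newrosetta_kafka_config.keys()
print(sorted(missing))  # → [] when all required fields are present
```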
Step 2: Choose Your Connector
Select the appropriate Confluent Connector based on your target system. Common examples include:
| System | Connector |
|---|---|
| SAP | SAP ERP Connector |
| Snowflake | Snowflake Sink Connector |
| SQL Database | JDBC Sink Connector |
| Elasticsearch | Elasticsearch Sink Connector |
| Custom System | Kafka Source or Sink Connectors |
For a full list of supported connectors, visit the Confluent Marketplace.
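As a rough illustration of what a connector definition involves, the sketch below outlines a JDBC Sink Connector configuration. The property names follow the Confluent JDBC Sink Connector documentation, but the connector name, connection URL, and credentials are all hypothetical placeholders; consult the documentation for your chosen connector for its exact required properties.

```python
# Sketch of a JDBC Sink Connector configuration (all values are placeholders).
jdbc_sink_config = {
    "name": "newrosetta-events-jdbc-sink",            # connector name (your choice)
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "newrosetta.events",                    # topic from your NewRosetta contact
    "connection.url": "jdbc:postgresql://<DB_HOST>:5432/<DB_NAME>",  # placeholder
    "connection.user": "<DB_USER>",                   # placeholder
    "connection.password": "<DB_PASSWORD>",           # placeholder
    "insert.mode": "insert",
    "auto.create": "true",    # let the connector create the target table
}
print(jdbc_sink_config["topics"])
```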
Step 3: Configure the Connector
- Log in to Confluent Control Center: Use your Confluent Cloud account.
- Set Up the Connector:
  - Navigate to Connectors and select your desired connector.
  - Provide the following details:
    - Kafka Topic: Obtain from your NewRosetta contact (e.g., newrosetta.events).
    - Authentication: Configure using the credentials shared by NewRosetta.
    - Transformation Rules (Optional): Define filtering, enrichment, or mapping rules if needed.
- Test the Connection:
  - Run a test to ensure the connector retrieves events correctly.
  - Check logs for errors or warnings.
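Beyond the Control Center UI, connector health can also be checked programmatically. The sketch below parses a status payload shaped like the Kafka Connect REST API's `/connectors/<name>/status` response; `summarize_status` is a hypothetical helper, and the sample payload is illustrative, not real NewRosetta output.

```python
def summarize_status(status: dict) -> tuple[bool, list[str]]:
    """Return (healthy, problems) for a Kafka Connect status payload."""
    problems = []
    connector_state = status.get("connector", {}).get("state")
    if connector_state != "RUNNING":
        problems.append(f"connector state: {connector_state}")
    for task in status.get("tasks", []):
        if task.get("state") != "RUNNING":
            problems.append(f"task {task.get('id')} state: {task.get('state')}")
    return (not problems, problems)

# Sample payload shaped like a Kafka Connect /status response (illustrative only).
sample = {
    "name": "newrosetta-events-jdbc-sink",
    "connector": {"state": "RUNNING", "worker_id": "worker-1:8083"},
    "tasks": [
        {"id": 0, "state": "RUNNING", "worker_id": "worker-1:8083"},
        {"id": 1, "state": "FAILED", "worker_id": "worker-2:8083"},
    ],
}
healthy, problems = summarize_status(sample)
print(healthy, problems)  # → False ['task 1 state: FAILED']
```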
Step 4: Validate Data Flow
- Confirm that events are being received in your target system.
- Cross-check the data format and structure against the NewRosetta event schema.
- Log sample events for audit or debugging purposes.
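Cross-checking incoming data against the event schema can be as simple as verifying required fields on each message. The field names below are hypothetical stand-ins; substitute the actual NewRosetta event schema provided by your contact.

```python
import json

# Hypothetical required fields; substitute the actual NewRosetta event schema.
REQUIRED_FIELDS = {"event_id", "event_type", "timestamp", "payload"}

def validate_event(raw: bytes) -> list[str]:
    """Return a list of validation errors for one event (empty list = valid)."""
    try:
        event = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    if not isinstance(event, dict):
        return ["event is not a JSON object"]
    missing = REQUIRED_FIELDS - event.keys()
    return [f"missing field: {f}" for f in sorted(missing)]

sample = b'{"event_id": "evt-1", "event_type": "order.created", "timestamp": "2024-01-01T00:00:00Z"}'
print(validate_event(sample))  # → ['missing field: payload']
```

Logging the returned error lists alongside sample events gives you the audit trail mentioned above.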
Best Practices
- Secure Access: Use encrypted connections (e.g., TLS) to protect data in transit.
- Monitoring: Enable monitoring in Confluent Control Center to track connector performance and error rates.
- Scaling: Configure your connector to handle peak loads without message loss.
- Event Filtering: Use transformation rules to filter irrelevant events or enrich data before processing.
- Retries: Ensure retry logic is in place for transient errors.
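The retry practice above is commonly implemented as exponential backoff with jitter. The sketch below is a minimal, generic version: `send` stands in for whatever delivery call your pipeline makes, and treating `ConnectionError` as the transient error type is an assumption to adapt to your client library.

```python
import random
import time

def backoff_delays(retries: int, base: float = 0.5, cap: float = 30.0) -> list[float]:
    """Exponential backoff with full jitter: delay n is uniform in [0, min(cap, base * 2**n)]."""
    return [random.uniform(0, min(cap, base * 2 ** n)) for n in range(retries)]

def deliver_with_retries(send, event, retries: int = 5, base: float = 0.5) -> bool:
    """Call send(event); on a transient error, back off and retry. True on success."""
    for delay in [0.0] + backoff_delays(retries, base=base):
        time.sleep(delay)
        try:
            send(event)
            return True
        except ConnectionError:  # treated as transient here; adjust to your client's errors
            continue
    return False

# Usage: a simulated sender that fails twice, then succeeds on the third attempt.
attempts = {"n": 0}
def flaky_send(event):
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient")

print(deliver_with_retries(flaky_send, {"event_id": "evt-1"}, base=0.01))  # → True
```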
Support and Troubleshooting
If you encounter issues during integration:
- Check Logs: Review logs in the Confluent Control Center for errors or warnings.
- Validate Configuration: Ensure Kafka topic names and credentials are accurate.
- Contact Support:
- Reach out to your NewRosetta contact for topic-specific queries.
- For connector-related issues, refer to the Confluent Support Center.
By following this guide, you can efficiently integrate your systems to consume events from NewRosetta without needing complex webhook configurations. For additional guidance, contact your NewRosetta representative or consult the Confluent documentation.