Does Vertica offer a scheduler that transfers data from Vertica to Kafka?
Not around KafkaExport; you will need to schedule your exports outside of Vertica (much like you'd schedule exports to Parquet, S3, and so on).
Note that for Data Collector information, a notifier can be defined to automatically export it to Kafka as it gets generated. See: https://www.vertica.com/docs/10.1.x/HTML/Content/Authoring/KafkaIntegrationGuide/Notifiers/AutomaticallySendingNotificationsofDCTableChanges.htm
Then if I use a notifier, could I send conditional data from a specific table to Kafka via KafkaExport every 5 minutes?
With a notifier, Vertica can send Data Collector information as soon as it is generated (almost like a trigger). Note that this is possible only with Data Collector tables. This blog post might be helpful: https://www.vertica.com/blog/publish-data-collector-tables-apache-kafka/
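For reference, a Data Collector notifier setup looks roughly like this sketch, following the linked docs. The notifier name, broker address, and Data Collector component (LoginFailures) are illustrative placeholders, not values from this thread:

```sql
-- Create a notifier that publishes messages to a Kafka broker.
-- 'dc_to_kafka' and the broker host:port are example values.
CREATE NOTIFIER dc_to_kafka
    ACTION 'kafka://kafka01.example.com:9092'
    MAXMEMORYSIZE '1G';

-- Tell Vertica to send notifications when a given Data Collector
-- component produces new rows (here: failed login attempts).
SELECT SET_DATA_COLLECTOR_NOTIFY_POLICY('LoginFailures', true);
```

This is the mechanism the blog post describes; it only fires for Data Collector tables, not for user-created tables.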
Hmm, I read the link you recommended, thank you.
So Vertica cannot monitor a general table (e.g. a custom table) and push its data to Kafka via a notifier?
If that is not possible, how can I solve this problem?
One option is to use a cron job as a time-based scheduler that sends conditional data from a specific table to Kafka via KafkaExport: https://www.vertica.com/docs/10.1.x/HTML/Content/Authoring/KafkaIntegrationGuide/KafkaFunctions/KafkaExport.htm
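Per the linked docs, KafkaExport takes the target Kafka partition, key, and value columns plus broker/topic parameters. A "conditional" export every 5 minutes could be a query like the following sketch, where the broker, topic, table, and column names are all assumptions for illustration:

```sql
-- Export only rows touched in the last 5 minutes (assumes the table
-- has a partition/key/message layout and an updated_at timestamp).
SELECT KafkaExport(partition, key, message
           USING PARAMETERS brokers='kafka01.example.com:9092',
                            topic='my_topic')
       OVER (PARTITION BEST)
FROM my_table
WHERE updated_at > now() - INTERVAL '5 minutes';
```

KafkaExport returns the rows it failed to deliver, so an empty result set means everything was sent.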
Then, do I need to register the KafkaExport query with the cron job?
Yes. Running as the dbadmin user, Vertica admintools already uses the Linux cron package to schedule jobs that regularly rotate the database logs.
See crontab -l output:
# Vertica administrator cron
# Minute Hour Day Month Day of Week Command
5 3 * * * /opt/vertica/oss/python3/bin/python3 -m vertica.do_logrotate &> /dev/null
So use a user with the right grants to run your script.
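Following the same pattern as the logrotate entry above, you could wrap the KafkaExport query in a script and schedule it every five minutes. The paths, user, and file names below are assumptions, not a tested setup:

```shell
#!/bin/bash
# /home/dbadmin/export_to_kafka.sh
# Runs the KafkaExport query stored in kafka_export.sql via vsql.
/opt/vertica/bin/vsql -U dbadmin -f /home/dbadmin/kafka_export.sql

# Corresponding crontab entry (add with `crontab -e` as a user
# with the right grants) -- runs every 5 minutes:
# */5 * * * * /home/dbadmin/export_to_kafka.sh >> /var/log/kafka_export.log 2>&1
```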