Data Pipeline/ETL Tools

Connectors by Source and Sink

재심 2022. 10. 29. 21:03

A personal summary of connectors worth considering for each source/sink combination.

| Source | Sink | Purpose | Tool | Notes | References |
|---|---|---|---|---|---|
| Kafka | Kafka | Transform messages from an existing topic and write them to a new topic | ksqlDB, Flink | Stream processing | |
| Kafka | Elasticsearch | Data migration | Logstash | | [kafka input](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html), [elasticsearch output](https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html) |
| Kafka | MongoDB | Data migration | Logstash | There is no MongoDB input plugin, only an output | [kafka input](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html), [mongodb output](https://www.elastic.co/guide/en/logstash/current/plugins-outputs-mongodb.html) |
| Kafka | Redis | Data migration | Logstash | | [kafka input](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html), [redis output](https://www.elastic.co/guide/en/logstash/current/plugins-outputs-redis.html) |
| Kafka | RDB (JDBC) | Data migration | Logstash, Flink, Kafka Connect | Flink provides both Kafka source/sink and JDBC source/sink; the Kafka JDBC connector is under the Confluent Community License | [kafka-connect-jdbc](https://www.confluent.io/hub/confluentinc/kafka-connect-jdbc) |
| RDB (JDBC) | Elasticsearch | Data migration | Logstash | | [jdbc input](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html), [elasticsearch output](https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html) |
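For the Kafka → Kafka row, a minimal ksqlDB sketch of "transform an existing topic into a new one". The stream names, topic names, and column schema (`orders`, `order_id`, `amount`, the `amount > 100` filter) are all hypothetical placeholders, not from any real deployment:

```sql
-- Declare a stream over an existing topic (hypothetical topic and schema)
CREATE STREAM orders_raw (order_id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

-- Persistent query: filtered records are continuously written
-- to the new backing topic 'orders_large'
CREATE STREAM orders_large
  WITH (KAFKA_TOPIC='orders_large') AS
  SELECT order_id, amount
  FROM orders_raw
  WHERE amount > 100
  EMIT CHANGES;
```

The second statement is a persistent query, so ksqlDB keeps producing to the new topic as messages arrive; a Flink job would achieve the same with a Kafka source and sink.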
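Since most rows in the table are "Logstash with a kafka input and some output plugin", here is a minimal pipeline sketch for the Kafka → Elasticsearch case. The broker address, topic name, and index pattern are assumptions for illustration; only the plugin names and option keys come from the linked plugin docs:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # assumption: local broker
    topics            => ["events"]         # hypothetical topic name
    codec             => "json"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]      # assumption: local ES node
    index => "events-%{+YYYY.MM.dd}"        # daily index, illustrative pattern
  }
}
```

The MongoDB, Redis, and JDBC rows differ only in the `output` block: swap in the corresponding output plugin from the references column.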