create routine load test.test1 on test1
columns (app_id, app_name, description, cookie_domain, cookie_path, index_url, verify_name, database, temp, __op = (CASE temp WHEN 'DELETE' THEN 1 ELSE 0 END))
PROPERTIES (
"format" = "json",
"jsonpaths" = "[\"$.data[0].app_id\",\"$.data[0].app_name\",\"$.data[0].description\",\"$.data[0].cookie_domain\",\"$.data[0].cookie_path\",\"$.data[0].index_url\",\"$.data[0].verify_name\",\"$.database\",\"$.type\"]",
"desired_concurrent_number" = "1",
"max_error_number" = "0"
)
FROM KAFKA (
"kafka_broker_list" = "10.0.10.12:9092",
"kafka_topic" = "test_test1"
);
At the time, a payload where `data` is an array was not supported.
It is still not supported for `data` to be an array.
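For reference, the canal-style message shape implied by the jsonpaths above looks roughly like this (field values are made up for illustration; only the `data` array, `database`, and `type` structure is taken from the load statement):

```json
{
  "data": [
    {
      "app_id": 1,
      "app_name": "demo",
      "description": "demo app",
      "cookie_domain": ".example.com",
      "cookie_path": "/",
      "index_url": "https://example.com/",
      "verify_name": "token"
    }
  ],
  "database": "test",
  "type": "INSERT"
}
```

Because `data` is an array, the jsonpaths have to index a single element (`$.data[0].*`), which is the limitation being discussed here.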
Please post your routine load configuration.
Yes, specifying `jsonpaths` and `json_root` at the same time is currently not supported. Could you use the flink-cdc approach instead? It can read the canal data from Kafka and sync it to StarRocks.
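A minimal Flink SQL sketch of that route, assuming the topic carries canal-json and using the flink-connector-starrocks sink (column names reuse the ones above; the primary key, ports, and credentials are placeholder assumptions):

```sql
-- Kafka source: Flink's canal-json format decodes canal change events
-- (INSERT/UPDATE/DELETE) into a changelog stream automatically.
CREATE TABLE test1_src (
    app_id        BIGINT,
    app_name      STRING,
    description   STRING,
    cookie_domain STRING,
    cookie_path   STRING,
    index_url     STRING,
    verify_name   STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'test_test1',
    'properties.bootstrap.servers' = '10.0.10.12:9092',
    'properties.group.id' = 'test1_sync',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'canal-json'
);

-- StarRocks sink via flink-connector-starrocks; a PRIMARY KEY table on the
-- StarRocks side lets deletes in the changelog be applied as deletes.
CREATE TABLE test1_sink (
    app_id        BIGINT,
    app_name      STRING,
    description   STRING,
    cookie_domain STRING,
    cookie_path   STRING,
    index_url     STRING,
    verify_name   STRING,
    PRIMARY KEY (app_id) NOT ENFORCED
) WITH (
    'connector' = 'starrocks',
    'jdbc-url' = 'jdbc:mysql://10.0.10.12:9030',
    'load-url' = '10.0.10.12:8030',
    'database-name' = 'test',
    'table-name' = 'test1',
    'username' = 'root',
    'password' = ''
);

INSERT INTO test1_sink SELECT * FROM test1_src;
```

This sidesteps the `jsonpaths`/`json_root` limitation entirely, since Flink's canal-json decoder handles the nested `data` array itself.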
See https://forum.starrocks.com/t/topic/1815 for reference.