Using Oracle Data Integrator to generate Spark applications and run them on OCI Data Flow
Blog: Oracle BPM
Oracle Cloud Infrastructure Data Flow (OCI Data Flow) lets you run Apache Spark jobs at any scale with almost no administration. It is serverless and elastic, and you pay only for the OCPUs you use while a job is running. More importantly, you don't need to administer, maintain, secure, and patch an entire cluster yourself.
You can write your own Spark applications in Python, Java, or Scala. Alternatively, you can use a data integration tool such as Oracle Data Integrator (ODI) to generate the Python code for you.