
Load a file to couchbase using spark

From: Rafa (Tajikistan)


I haven't found any clear solution for loading a file into Couchbase using Spark.

I have a huge file with a lot of records similar to this.


My code:

spark-shell --packages com.couchbase.client:spark-connector_2.11:2.2.0 --conf spark.couchbase.username=username --conf spark.couchbase.password=password --conf spark.couchbase.bucket.bucketname="" --conf spark.couchbase.nodes=,,

import com.couchbase.client.java.document.JsonDocument
import com.couchbase.client.java.document.json.JsonObject
import com.couchbase.spark._

// Read the pipe-delimited file and register it for SQL access
val df = spark.read.option("delimiter", "|").option("header", true).csv("/hdfsData/test.doc")
df.createOrReplaceTempView("TempQuery")

// Turn each row into a JsonDocument keyed on the first column and write it
spark.sql("select * from TempQuery").rdd.map(row => {
    val content = JsonObject.create().put("Content", row.getString(1))
    JsonDocument.create(row.getString(0), content)
}).saveToCouchbase()

I know this is wrong, but I just started; I am new to Scala and Couchbase.

Please let me know your inputs. Basically, I have the key and the value in a file separated by | and I want to load them into Couchbase.
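Independent of the Spark/Couchbase wiring, the key/value split itself can be sketched in plain Scala. This is a minimal sketch, assuming one `key|value` pair per line; `PipeParse` and the sample lines are hypothetical, not from the question:

```scala
object PipeParse {
  // Split on the FIRST '|' only, in case the value itself contains pipes.
  // Returns the (documentId, content) pair that the Spark job would feed
  // into JsonDocument.create(id, content) before writing to Couchbase.
  def parseLine(line: String): (String, String) = {
    val Array(key, value) = line.split("\\|", 2)
    (key.trim, value.trim)
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical sample records in the assumed key|value layout
    val lines = Seq("1001|first record", "1002|second record")
    lines.map(parseLine).foreach(println)
  }
}
```

In the Spark version the same split is done by `spark.read.option("delimiter", "|")`, so this helper is only meant to make the intended document shape explicit.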

Best answer