Reading multiple json files from Spark -


I have a list of JSON files to load in parallel.

I can't use read.json("*") because the files aren't in the same folder, and there's no specific pattern I can implement.

I've tried sc.parallelize(fileList).select(hiveContext.read.json), but, as expected, the hive context doesn't exist on the executors.

any ideas?

Looks like I found a solution:

val text = sc.textFile("file1,file2,...")
val df = sqlContext.read.json(text)
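A fuller sketch of that approach, assuming Spark 1.x with an SQLContext (the file paths and app name are hypothetical): sc.textFile accepts a comma-separated list of paths, so the scattered files can be joined into one string, read as lines, and parsed by read.json, which accepts an RDD[String] of JSON records.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object ReadManyJson {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("read-many-json"))
    val sqlContext = new SQLContext(sc)

    // Hypothetical paths scattered across unrelated folders
    val fileList = List("/data/a/file1.json", "/other/b/file2.json")

    // textFile takes a comma-separated list of paths, so join the list
    val text = sc.textFile(fileList.mkString(","))

    // read.json parses an RDD of JSON-line strings into a DataFrame
    val df = sqlContext.read.json(text)
    df.printSchema()
  }
}
```

In Spark 2+, DataFrameReader.json is also overloaded to take multiple paths directly, so spark.read.json(fileList: _*) should work without going through textFile.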
