    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("Simple App").setMaster("local")
    sc = SparkContext(conf=conf)

    file = "./README.md"
    # README.md contains two lines:
    # 111 aaa bcd
    # 22 qqq www

    dataFile = sc.textFile(file)

    test = dataFile.map(lambda s: s)
    print(test.collect())
    # [u'111 aaa bcd', u'22 qqq www']

    test = dataFile.flatMap(lambda s: s)
    print(test.collect())
    # [u'1', u'1', u'1', u' ', u'a', u'a', u'a', u' ', u'b', u'c', u'd', u'2', u'2', u' ', u'q', u'q', u'q', u' ', u'w', u'w', u'w']

The docs describe map as: "Return a new distributed dataset formed by passing each element of the source through a function func." My understanding is that func is applied to every element of the RDD, so the number of output elements should equal the number of input elements. flatMap is described as: "Similar to map, but each input item can be mapped to 0 or more output items (so func should return a Seq rather than a single item)." What I don't understand is why the flatMap result here is individual characters.
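For comparison, here is a minimal standalone sketch (the app name and the parallelize data are illustrative, not part of the original snippet) that reproduces both results from an in-memory RDD. The key point is that a Python string is itself an iterable of characters, so flatMap(lambda s: s) flattens each line into single letters; returning a list such as s.split(" ") flattens it into words instead.

    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("MapVsFlatMap").setMaster("local")
    sc = SparkContext(conf=conf)

    # Same two lines as the README example above, but as an in-memory RDD.
    lines = sc.parallelize([u'111 aaa bcd', u'22 qqq www'])

    # map: exactly one output element per input element, so the line count is preserved.
    print(lines.map(lambda s: s.split(" ")).collect())
    # [[u'111', u'aaa', u'bcd'], [u'22', u'qqq', u'www']]

    # flatMap: each element must map to an iterable, whose items are then flattened.
    # A Python string is an iterable of characters, which is why lambda s: s yields letters.
    print(lines.flatMap(lambda s: s).collect())
    # [u'1', u'1', u'1', u' ', u'a', u'a', u'a', ...]

    # Returning a list of words instead gives one flat RDD of words.
    print(lines.flatMap(lambda s: s.split(" ")).collect())
    # [u'111', u'aaa', u'bcd', u'22', u'qqq', u'www']

    sc.stop()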
2 Answers

米脂
TA貢獻(xiàn)1836條經(jīng)驗(yàn) 獲得超3個(gè)贊
The map-family operations I remember are map (one record to one record), mapToPair (map to key-value pairs), and flatMap (one record becomes n records, n >= 0). Take a look at the official wordCount demo; it explains what flatMap does in great detail. A sketch of that pattern follows below.
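To make the answer concrete, here is a minimal word-count sketch in PySpark (the file path ./README.md comes from the question; the app name and variable names are placeholders). Note that mapToPair belongs to Spark's Java API; in PySpark you simply return a (key, value) tuple from map. flatMap turns each line into zero or more words, map turns each word into a pair, and reduceByKey sums the counts.

    from operator import add
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("WordCount").setMaster("local")
    sc = SparkContext(conf=conf)

    lines = sc.textFile("./README.md")

    counts = (lines
              .flatMap(lambda s: s.split(" "))   # one line -> n words (n >= 0)
              .map(lambda w: (w, 1))             # one word -> one (key, value) pair
              .reduceByKey(add))                 # sum the counts for each word

    print(counts.collect())
    # e.g. [(u'111', 1), (u'aaa', 1), (u'bcd', 1), (u'22', 1), (u'qqq', 1), (u'www', 1)]

    sc.stop()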