[Spark] spark run-example finishes but produces no result

Posted on 2015-06-18 17:06
Running /usr/local/spark/bin/run-example org.apache.spark.examples.SparkPi local completes, but it never returns a result:

$ /usr/local/spark/bin/run-example org.apache.spark.examples.SparkPi local
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/18 16:39:04 INFO SparkContext: Running Spark version 1.4.0
15/06/18 16:39:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/06/18 16:39:05 INFO SecurityManager: Changing view acls to: hadoop
15/06/18 16:39:05 INFO SecurityManager: Changing modify acls to: hadoop
15/06/18 16:39:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/06/18 16:39:05 INFO Slf4jLogger: Slf4jLogger started
15/06/18 16:39:05 INFO Remoting: Starting remoting
15/06/18 16:39:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.7.244:36288]
15/06/18 16:39:06 INFO Utils: Successfully started service 'sparkDriver' on port 36288.
15/06/18 16:39:06 INFO SparkEnv: Registering MapOutputTracker
15/06/18 16:39:06 INFO SparkEnv: Registering BlockManagerMaster
15/06/18 16:39:06 INFO DiskBlockManager: Created local directory at /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3/blockmgr-851c4a8c-8f89-4c22-b81f-9fe095e941d8
15/06/18 16:39:06 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
15/06/18 16:39:06 INFO HttpFileServer: HTTP File server directory is /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3/httpd-073fa9e8-2646-4fc9-a8d7-9dcb1511c029
15/06/18 16:39:06 INFO HttpServer: Starting HTTP Server
15/06/18 16:39:06 INFO Utils: Successfully started service 'HTTP file server' on port 35185.
15/06/18 16:39:06 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/18 16:39:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/06/18 16:39:06 INFO SparkUI: Started SparkUI at http://192.168.7.244:4040
15/06/18 16:39:06 INFO SparkContext: Added JAR file:/usr/local/spark/lib/spark-examples-1.4.0-hadoop2.6.0.jar at http://192.168.7.244:35185/jars/ ... 4.0-hadoop2.6.0.jar with timestamp 1434616746886
15/06/18 16:39:06 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@192.168.7.244:7077/user/Master...
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150618163907-0005
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/0 on worker-20150618145843-192.168.7.247-54347 (192.168.7.247:54347) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/0 on hostPort 192.168.7.247:54347 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/1 on worker-20150618145843-192.168.7.237-55205 (192.168.7.237:55205) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/1 on hostPort 192.168.7.237:55205 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/2 on worker-20150618145840-192.168.7.246-40048 (192.168.7.246:40048) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/2 on hostPort 192.168.7.246:40048 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor added: app-20150618163907-0005/3 on worker-20150618145840-192.168.7.232-52496 (192.168.7.232:52496) with 2 cores
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150618163907-0005/3 on hostPort 192.168.7.232:52496 with 2 cores, 512.0 MB RAM
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/2 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/3 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/0 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/1 is now LOADING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/0 is now RUNNING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/1 is now RUNNING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/2 is now RUNNING
15/06/18 16:39:07 INFO AppClient$ClientActor: Executor updated: app-20150618163907-0005/3 is now RUNNING
15/06/18 16:39:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44921.
15/06/18 16:39:07 INFO NettyBlockTransferService: Server created on 44921
15/06/18 16:39:07 INFO BlockManagerMaster: Trying to register BlockManager
15/06/18 16:39:07 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.7.244:44921 with 265.1 MB RAM, BlockManagerId(driver, 192.168.7.244, 44921)
15/06/18 16:39:07 INFO BlockManagerMaster: Registered BlockManager
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NumberFormatException: For input string: "local"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:580)
        at java.lang.Integer.parseInt(Integer.java:615)
        at scala.collection.immutable.StringLike$class.toInt(StringLike.scala:229)
        at scala.collection.immutable.StringOps.toInt(StringOps.scala:31)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:29)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/18 16:39:07 INFO SparkContext: Invoking stop() from shutdown hook
15/06/18 16:39:07 INFO SparkUI: Stopped Spark web UI at http://192.168.7.244:4040
15/06/18 16:39:07 INFO DAGScheduler: Stopping DAGScheduler
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Shutting down all executors
15/06/18 16:39:07 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
15/06/18 16:39:07 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/06/18 16:39:07 INFO Utils: path = /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3/blockmgr-851c4a8c-8f89-4c22-b81f-9fe095e941d8, already present as root for deletion.
15/06/18 16:39:07 INFO MemoryStore: MemoryStore cleared
15/06/18 16:39:07 INFO BlockManager: BlockManager stopped
15/06/18 16:39:07 INFO BlockManagerMaster: BlockManagerMaster stopped
15/06/18 16:39:07 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/06/18 16:39:07 INFO SparkContext: Successfully stopped SparkContext
15/06/18 16:39:07 INFO Utils: Shutdown hook called
15/06/18 16:39:07 INFO Utils: Deleting directory /tmp/spark-c7749c63-d53e-4bd8-a377-ddadbfdeeed3

Isn't it supposed to end with something like "Pi is roughly 3.1444"?
I've only just started with Spark and don't understand what's going on here.
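
The stack trace explains the missing output: the run dies at SparkPi.scala:29, where the example converts its first command-line argument to an Int (the StringOps.toInt frame). In the Spark 1.x examples that first argument is the number of slices to split the job into, not the master URL, so the string "local" fails inside Integer.parseInt before any tasks are scheduled. Below is a rough sketch of that argument handling, paraphrased from the stack trace and the usual shape of the example rather than copied from the real source; the object name SparkPiSketch is made up for illustration.

    // Sketch of what org.apache.spark.examples.SparkPi (Spark 1.x) does around line 29 -- a paraphrase, not the verbatim source.
    import org.apache.spark.{SparkConf, SparkContext}
    import scala.math.random

    object SparkPiSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("Spark Pi")
        val sc = new SparkContext(conf)                          // master is supplied by spark-submit/run-example, not by args
        val slices = if (args.length > 0) args(0).toInt else 2   // "local".toInt -> NumberFormatException: For input string: "local"
        val n = math.min(100000L * slices, Int.MaxValue).toInt
        val count = sc.parallelize(1 until n, slices).map { _ =>
          val x = random * 2 - 1                                 // random point in the unit square
          val y = random * 2 - 1
          if (x * x + y * y < 1) 1 else 0                        // 1 if it lands inside the unit circle
        }.reduce(_ + _)
        println("Pi is roughly " + 4.0 * count / n)              // the line that never gets reached in the run above
        sc.stop()
      }
    }

If that reading is right, the fix is to drop the "local" argument: in 1.x the run-example wrapper is supposed to pick the master itself (typically from the MASTER environment variable or spark-submit's --master, defaulting to local[*]), and everything after the class name is passed straight to the example. An invocation along these lines should end with the expected "Pi is roughly ..." output (the 10 is just an illustrative slice count):

$ /usr/local/spark/bin/run-example org.apache.spark.examples.SparkPi 10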