Prepare Data for ML APIs on Google Cloud: Challenge Lab Reviews

164277 reviews

Luiz G. · Reviewed about 7 hours ago

Alberth A. · Reviewed about 7 hours ago

Eduardo A. · Reviewed about 7 hours ago

Clara S. · Reviewed about 7 hours ago

Hemanth K. · Reviewed about 7 hours ago

Alana G. · Reviewed about 7 hours ago

Celso C. · Reviewed about 7 hours ago

Gustavo F. · Reviewed about 7 hours ago

Kaylane B. · Reviewed about 7 hours ago

rebeca r. · Reviewed about 7 hours ago

KRISHNA G. · Reviewed about 8 hours ago

DUBBUDU T. · Reviewed about 8 hours ago

Ci L. · Reviewed about 8 hours ago

Joao A. · Reviewed about 8 hours ago

JITTA A. · Reviewed about 8 hours ago

Jagadeesh D. · Reviewed about 9 hours ago

Garyee T. · Reviewed about 9 hours ago

confusing

Irene L. · Reviewed about 9 hours ago

Michael O. · Reviewed about 9 hours ago

Franco C. · Reviewed about 9 hours ago

André d. · Reviewed about 10 hours ago

Unable to finish. Dataproc job gave the following error:
Caused by: java.lang.NoClassDefFoundError: scala/math/Ordering
    at org.apache.spark.examples.SparkPageRank.main(SparkPageRank.scala)

John H. · Reviewed about 10 hours ago
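
Note: the NoClassDefFoundError: scala/math/Ordering above is thrown from inside SparkPageRank.main, which suggests the examples jar was run without Spark's Scala runtime on the classpath; a common way to hit that in this lab is submitting the jar with the Hadoop job type instead of the Spark job type. The sketch below shows one way to submit it explicitly as a Spark job with the Dataproc Python client; the project ID, region, cluster name and input argument are placeholders, not values taken from the lab instructions.

    # Minimal sketch: submit SparkPageRank as a Spark job so the Scala libraries are loaded.
    # All identifiers below (project, region, cluster, input path) are placeholders.
    from google.cloud import dataproc_v1

    project_id = "your-lab-project-id"
    region = "us-central1"
    cluster_name = "your-cluster"

    client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    job = {
        "placement": {"cluster_name": cluster_name},
        "spark_job": {  # spark_job (not hadoop_job), so the Spark/Scala runtime is on the classpath
            "main_class": "org.apache.spark.examples.SparkPageRank",
            "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
            "args": ["/data.txt"],  # whatever input path the lab expects
        },
    }

    operation = client.submit_job_as_operation(
        request={"project_id": project_id, "region": region, "job": job}
    )
    print(operation.result().driver_output_resource_uri)

The same submission works from the console as long as the job type is set to Spark; the point is only that SparkPageRank is a Scala program and needs a Spark submission, not a plain Hadoop one.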

Muhammad Zahran A. · Reviewed about 10 hours ago

Tried 3 times to create the job; it always fails...

WARN: This is a naive implementation of PageRank and is given as an example! Please use the PageRank implementation found in org.apache.spark.graphx.lib.PageRank for more conventional use.
24/06/10 14:24:21 INFO SparkEnv: Registering MapOutputTracker
24/06/10 14:24:22 INFO SparkEnv: Registering BlockManagerMaster
24/06/10 14:24:22 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
24/06/10 14:24:22 INFO SparkEnv: Registering OutputCommitCoordinator
24/06/10 14:24:23 INFO DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at cluster-e74d-m.us-central1-b.c.qwiklabs-gcp-03-347f94788cfb.internal./10.128.0.10:8032
24/06/10 14:24:24 INFO AHSProxy: Connecting to Application History server at cluster-e74d-m.us-central1-b.c.qwiklabs-gcp-03-347f94788cfb.internal./10.128.0.10:10200
24/06/10 14:24:25 INFO Configuration: resource-types.xml not found
24/06/10 14:24:25 INFO ResourceUtils: Unable to find 'resource-types.xml'.
24/06/10 14:24:27 INFO YarnClientImpl: Submitted application application_1718029314860_0001
24/06/10 14:24:29 INFO DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at cluster-e74d-m.us-central1-b.c.qwiklabs-gcp-03-347f94788cfb.internal./10.128.0.10:8030
24/06/10 14:24:33 INFO MetricsConfig: Loaded properties from hadoop-metrics2.properties
24/06/10 14:24:33 INFO MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
24/06/10 14:24:33 INFO MetricsSystemImpl: google-hadoop-file-system metrics system started
24/06/10 14:24:34 INFO GoogleCloudStorageImpl: Ignoring exception of type GoogleJsonResponseException; verified object already exists with desired state.
24/06/10 14:24:36 INFO GoogleHadoopOutputStream: hflush(): No-op due to rate limit (RateLimiter[stableRate=0.2qps]): readers will *not* yet see flushed data for gs://dataproc-temp-us-central1-216016258895-fbueqx14/75101875-af16-49fd-8747-a9f5efd70ab8/spark-job-history/application_1718029314860_0001.inprogress [CONTEXT ratelimit_period="1 MINUTES" ]
Exception in thread "main" org.apache.spark.sql.AnalysisException: [PATH_NOT_FOUND] Path does not exist: hdfs://cluster-e74d-m/data.txt.
    at org.apache.spark.sql.errors.QueryCompilationErrors$.dataPathNotExistError(QueryCompilationErrors.scala:1500)
    at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$checkAndGlobPathIfNecessary$4(DataSource.scala:757)
    at org.apache.spark.sql.execution.datasources.DataSource$.$anonfun$checkAndGlobPathIfNecessary$4$adapted(DataSource.scala:754)
    at org.apache.spark.util.ThreadUtils$.$anonfun$parmap$2(ThreadUtils.scala:380)
    at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
    at scala.util.Success.$anonfun$map$1(Try.scala:255)
    at scala.util.Success.map(Try.scala:213)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
    at java.base/java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1426)
    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
    at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
    at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
    at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)

Kabuqueci S. · Reviewed about 10 hours ago
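
Note: the actual failure in the log above is the final AnalysisException: [PATH_NOT_FOUND], i.e. the job was asked to read hdfs://cluster-e74d-m/data.txt but nothing named data.txt had been staged into the cluster's HDFS (the PageRank WARN line is harmless). One way to stage the file before resubmitting is a DistCp copy from Cloud Storage run as a Dataproc Hadoop job, sketched below; the source bucket, project ID, region and cluster name are placeholders, and copying with hdfs dfs on the master node over SSH achieves the same thing.

    # Minimal sketch: copy the input file from Cloud Storage into HDFS so that
    # hdfs:///data.txt exists before SparkPageRank runs. Placeholders throughout.
    from google.cloud import dataproc_v1

    project_id = "your-lab-project-id"
    region = "us-central1"
    cluster_name = "your-cluster"
    source_uri = "gs://your-bucket/data.txt"  # wherever the lab provides the file

    client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    copy_job = {
        "placement": {"cluster_name": cluster_name},
        "hadoop_job": {
            # DistCp ships with Hadoop on Dataproc and reads gs:// via the GCS connector.
            "main_class": "org.apache.hadoop.tools.DistCp",
            "args": [source_uri, "hdfs:///data.txt"],
        },
    }

    operation = client.submit_job_as_operation(
        request={"project_id": project_id, "region": region, "job": copy_job}
    )
    operation.result()  # wait for the copy, then submit the PageRank job as usual

If the lab's checker allows it, an even simpler route is to pass a gs:// URI directly as the PageRank input argument, since Dataproc clusters can read Cloud Storage paths through the same connector.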

very hard lab

Werner S. · Reviewed about 11 hours ago

We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.