About Spark: SparkScala-Learning

Problem 1:
Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module
Solution:
Apache Spark is not compatible with Java 16. Starting with Java 16, the JDK enforces strong encapsulation by default and blocks the reflective access to internals such as java.nio.DirectByteBuffer that Spark relies on, which is exactly what this error reports.
When downloading a JDK for Spark, Java 11 is the safest choice.
I am using Java 16, so I need to switch the project's JDK to 11.
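Before and after the switch, it helps to confirm which JVM the project actually runs on. A minimal sketch (the object name JavaVersionCheck is just illustrative):

```scala
// Prints the Java version the project is really running on.
// Expect "16" before the switch and "11" afterwards.
object JavaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(sys.props("java.specification.version")) // e.g. "11"
    println(sys.props("java.home"))                  // path of the active JDK
  }
}
```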
Download the JDK 11 installer from Oracle and install it.
File – Project Structure – Platform Settings – SDKs – + – add JDK 11
File – Project Structure – Project – SDK – choose JDK 11
Rebuild the project. Run.
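To verify that the DirectByteBuffer error is gone, a small local-mode smoke test can be run. This is only a sketch, assuming a standard spark-sql dependency is on the classpath; the object name SparkSmokeTest is illustrative:

```scala
import org.apache.spark.sql.SparkSession

object SparkSmokeTest {
  def main(args: Array[String]): Unit = {
    // Local-mode session; no cluster required.
    val spark = SparkSession.builder()
      .appName("SparkSmokeTest")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._
    // A trivial job that previously failed at startup on Java 16.
    val df = Seq(1, 2, 3, 4, 5).toDF("n")
    println(s"count = ${df.count()}") // expect 5

    spark.stop()
  }
}
```

If this prints the count without the "module java.base does not opens java.nio" error, the JDK switch has taken effect.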
