
Announcing Hazelcast Jet 0.6 – The 3rd Generation Big Data Processing Engine

23 points by Darren1968 about 7 years ago

2 comments

gregrluck about 7 years ago
This is done using the Jet API programmatically from your Java application. Jet is meant to be used operationally with developed and deployed applications.

We can use HDFS as a source or a sink. See https://github.com/hazelcast/hazelcast-jet-code-samples/blob/0.6-maintenance/batch/wordcount-hadoop/src/main/java/HadoopWordCount.java for an HDFS WordCount example.

Jet jobs run in an isolated class loader, which is distributed to the cluster when the job is started. You do this by adding classes/jars to JobConfig. See http://docs.hazelcast.org/docs/jet/0.6/manual/#practical-considerations for details.
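To make the programmatic submission flow concrete, here is a minimal sketch assuming the Jet 0.6 Pipeline API and a reachable Jet cluster. The list name, class name, and jar path are illustrative placeholders, not taken from the Hazelcast samples.

```java
import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.config.JobConfig;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;

public class SubmitJobExample {
    public static void main(String[] args) {
        // Connect to a running Jet cluster as a client.
        JetInstance jet = Jet.newJetClient();

        // Build a trivial pipeline: read strings from a distributed IList,
        // upper-case them, and write the results to the member logs.
        Pipeline p = Pipeline.create();
        p.drawFrom(Sources.<String>list("input-list"))
         .map(String::toUpperCase)
         .drainTo(Sinks.logger());

        // The job runs in an isolated class loader on the cluster, so any
        // classes or jars the pipeline needs are added to the JobConfig.
        JobConfig config = new JobConfig();
        config.addClass(SubmitJobExample.class);
        // config.addJar("/path/to/extra-dependencies.jar"); // hypothetical path

        // Submit the job and block until it completes.
        jet.newJob(p, config).join();

        Jet.shutdownAll();
    }
}
```

Packaging the application as a plain Java program and shipping its classes via JobConfig is what replaces a spark-submit-style command in this model.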
dxxvi about 7 years ago
If I already have a Hadoop cluster that can run a Spark job from a jar file on HDFS with spark-submit, how can I install Hazelcast Jet so that I can do the same as with Spark?