It’s been a while since I worked with Spark Streaming. It was back when I was working on a pet project that ultimately ended up as the Typesafe Activator template Spark Streaming with Scala and Akka, meant to get people going with the technologies.
Time flies by very quickly and, as the other blog posts may have shown, I’m evaluating Apache Kafka as a potential messaging and integration platform for my future projects. A lot is happening in the so-called big data space, and Apache Kafka fits the bill in many dataflows around me so well. I’m very glad it’s mostly all Scala, which we all love and are spending our time with. Aren’t we?
From the Spark Streaming documentation (Kafka bolded on purpose):
Spark Streaming is an extension of the core Spark API that enables scalable, high-throughput, fault-tolerant stream processing of live data streams. Data can be ingested from many sources like **Kafka**, Flume, Twitter, ZeroMQ, Kinesis, or TCP sockets, and can be processed using complex algorithms expressed with high-level functions like map, reduce, join and window.
Since Apache Kafka aims at being the central hub for real-time streams of data (see 1.2 Use Cases and Putting Apache Kafka To Use: A Practical Guide to Building a Stream Data Platform (Part 1)) I couldn’t deny myself the simple pleasure of giving it a go.
Buckle up and ingest some data using Apache Kafka and Spark Streaming! You surely will love the infrastructure (if you haven’t already). Be sure to type fast to see the potential of the platform at your fingertips.
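To give a feel for the pipeline before diving in, here is a minimal sketch of ingesting a Kafka topic in Spark Streaming and running a classic word count over it. It assumes Spark 1.x’s receiver-based `KafkaUtils.createStream` API; the ZooKeeper address, consumer group and topic name are placeholders.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

val conf = new SparkConf().setAppName("kafka-ingest").setMaster("local[2]")
val ssc  = new StreamingContext(conf, Seconds(5))

// Receiver-based stream: ZooKeeper quorum, consumer group, topic -> receiver threads.
val lines = KafkaUtils
  .createStream(ssc, "localhost:2181", "spark-group", Map("my-topic" -> 1))
  .map(_._2) // drop the key, keep the message payload

// The high-level operators from the quote above: map, reduce (by key), etc.
lines.flatMap(_.split("\\s+"))
  .map((_, 1))
  .reduceByKey(_ + _)
  .print()

ssc.start()
ssc.awaitTermination()
```

Running it requires the `spark-streaming-kafka` artifact on the classpath and a Kafka broker with the topic populated.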
I’m a Scala proponent, so when I found out that the Apache Kafka team had decided to switch to Java as the main language of the new client API, it was beyond my imagination. Akka does fine with its Java/Scala APIs, so I can’t believe Kafka couldn’t offer similar APIs, too. It’s even weirder when one finds out that Apache Kafka itself is written in Scala. Why on earth did they decide to do the migration?!
In order to learn Kafka better, I developed a custom producer using the latest Kafka Producer API in Scala. I built Kafka from the sources, so I’m using version 0.8.3-SNAPSHOT. It was a pretty surprising experience, especially when I ran across java.util.concurrent.Future, which seems so limited compared to what scala.concurrent.Future offers. No flatMap or the like? So far I consider the switch to Java for the client API a big mistake.
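To illustrate the gap, here is a small sketch (the helper name is mine) of bridging a java.util.concurrent.Future back into a scala.concurrent.Future so that flatMap and friends become available again:

```scala
import java.util.concurrent.{Callable, Executors, Future => JFuture}
import scala.concurrent.{Await, Future, blocking}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// java.util.concurrent.Future only offers a blocking get() -- wrap it in a
// scala.concurrent.Future to recover map/flatMap; the blocking call runs
// on a pooled thread, marked with blocking() as a hint to the scheduler.
def asScala[A](jf: JFuture[A]): Future[A] = Future(blocking(jf.get()))

val pool = Executors.newSingleThreadExecutor()
val javaFuture: JFuture[Int] =
  pool.submit(new Callable[Int] { def call(): Int = 21 })

// Now we can compose -- something the Java future cannot do on its own.
val doubled: Future[Int] = asScala(javaFuture).flatMap(n => Future(n * 2))
val result = Await.result(doubled, 5.seconds)
pool.shutdown()
```

The `Await` at the end is only there to make the example deterministic; in real code you would keep composing the future instead.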
Here comes the complete Kafka producer I’ve developed in Scala. It’s supposed to serve as a basis for my future development endeavours using the API as it will be in the 0.8.3 release.
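The original listing isn’t reproduced here, but a minimal producer against the new Producer API looks roughly like this (broker address and topic name are placeholders):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord, RecordMetadata}

val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

val producer = new KafkaProducer[String, String](props)

// send returns a java.util.concurrent.Future[RecordMetadata] -- hence the
// blocking get() here; no flatMap in sight.
val metadata: RecordMetadata =
  producer.send(new ProducerRecord[String, String]("my-topic", "hello")).get()

println(s"partition=${metadata.partition()} offset=${metadata.offset()}")
producer.close()
```

It needs a running broker and the Kafka clients jar on the classpath.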
Kafka has always been high on my list of things to explore, but since there are quite a few things high on that list, Kafka couldn’t actually make it to the very top. Until just recently, when I was asked to give the broker a try and see whether or not it meets a project’s needs. Two projects, to be honest. You should have seen my face when I heard it.
I compiled Apache Kafka from the sources, connected it to Spark Streaming and even attempted to answer a few questions on StackOverflow (How to use Kafka in Flink using Scala? and How to monitor Kafka broker using jmxtrans?), not to mention reading tons of articles and watching videos about the tool. I’ve developed a pretty strong sense of which use cases are the sweet spot for Apache Kafka.
With the team at Codilime I’m developing the DeepSense.io platform, where we have just used Ansible to automate deployment. We’ve also been evaluating Docker and/or Vagrant. All to ease the deployment of DeepSense.io.
That’s the moment when these two needs converged: exploring Apache Kafka and Docker (among other tools) for three separate projects! Amazing, isn’t it? I could finally explore how Docker might ease exploration of products and their deployment. I knew Docker could ease my developer life, but it’s only now that I’ve really seen it. I would now dockerize everything. When I was told about the images wurstmeister/kafka and wurstmeister/zookeeper I couldn’t have been happier. Running Apache Kafka using Docker finally became a no-brainer and such a pleasant experience.
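For reference, a compose file for those two images looks roughly like this (the advertised host IP is an example and must match your Docker host):

```yaml
# docker-compose.yml -- a sketch for the wurstmeister images
zookeeper:
  image: wurstmeister/zookeeper
  ports:
    - "2181:2181"
kafka:
  image: wurstmeister/kafka
  ports:
    - "9092:9092"
  environment:
    KAFKA_ADVERTISED_HOST_NAME: 192.168.59.103
    KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
  links:
    - zookeeper
```

With that in place, `docker-compose up` brings up a single-broker Kafka with its ZooKeeper in one go.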
I then thought I’d share the love so it’s not only mine and others could benefit from it, too.
My journey into the depths of Scala is in full swing. Not only can I learn the theory, but I can also apply it to commercial projects (with the Scala development teams of DeepSense.io and others). Each day I feel I’m getting better at using Scala’s type system in a more conscious and (hopefully) efficient manner.
This time I sank into type classes, a means of doing ad hoc polymorphism in Scala.
From the ad hoc polymorphism article on Wikipedia:
In programming languages, ad hoc polymorphism is a kind of polymorphism in which polymorphic functions can be applied to arguments of different types, because a polymorphic function can denote a number of distinct and potentially heterogeneous implementations depending on the type of argument(s) to which it is applied.
The blog post presents a way to implement the type classes concept in Scala.
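A minimal sketch of the pattern (the `Show` type class and its instances are illustrative names, not from any library):

```scala
// A type class: behaviour defined separately from the types it applies to.
trait Show[A] {
  def show(a: A): String
}

object Show {
  // The polymorphic entry point: picks an implementation via implicit lookup.
  def show[A](a: A)(implicit s: Show[A]): String = s.show(a)

  // Instances -- one distinct implementation per type, as the quote describes.
  implicit val intShow: Show[Int] = new Show[Int] {
    def show(a: Int): String = s"Int($a)"
  }
  implicit val stringShow: Show[String] = new Show[String] {
    def show(a: String): String = s"String($a)"
  }
}

val shownInt = Show.show(42)       // resolves intShow
val shownStr = Show.show("scala")  // resolves stringShow
```

Calling `Show.show` on a type with no instance in scope fails at compile time, which is the main selling point over runtime dispatch.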
p.s. I’m yet to find out how much of it resembles multimethods in Clojure (which were once of much help in introducing me to functional programming).
Arek Komarzewski (a Scala developer at HCore) mentioned the following this Friday and made my day (and the whole week, too):
I can now ditch Guice’s @Singleton as I’ve got a trait and the companion object combo (thanks to Scala).
This time the blog post comes without a complete working example. Not yet. This is to remind myself to prepare one (or be given one after the blog post is published – whichever comes first). I just think it needs to be said aloud to be heard and thought about.
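Until a proper example materializes, here is a rough sketch of what the trait-plus-companion-object combo can look like (the `Counter` service is a made-up example, not Arek’s code):

```scala
// The service is defined as a trait...
trait Counter {
  private var n = 0
  def next(): Int = { n += 1; n }
}

// ...and its single shared instance is simply the companion object.
// No DI container annotation needed: the object is a language-level singleton.
object Counter extends Counter

val a = Counter.next()
val b = Counter.next()
```

Code that wants the singleton uses `Counter` directly; code that wants testability can still depend on the `Counter` trait and receive any instance.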
What a joy to learn all the goodies sbt brings to the table and be given a chance to apply it right away to commercial projects in Scala!
I’ve recently been assigned a task: create a way to share common settings across projects in a multi-project build managed by sbt. With sbt’s new feature – auto plugins – it was very easy to implement from day one.
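The shape of such an auto plugin is roughly the following (the plugin name and setting values are illustrative; this lives in the build definition, e.g. under `project/`):

```scala
import sbt._
import Keys._

// An sbt auto plugin that injects shared settings into every project
// in the build, with no explicit enablePlugins call per project.
object CommonSettingsPlugin extends AutoPlugin {

  // allRequirements = activate automatically wherever requirements are met.
  override def trigger = allRequirements

  override lazy val projectSettings = Seq(
    organization  := "com.example",
    scalaVersion  := "2.11.6",
    scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature")
  )
}
```

Every subproject in the multi-project build then picks these settings up automatically, and can still override any of them locally.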
I’m yet to appreciate gerrit as a code review tool worth learning (after having heard good and bad stories about its features and how it complements development workflows), but in my new team at Codilime, where we develop…a revolutionary machine learning engine enabling your team to use state-of-the-art algorithms in a fraction of the time!…that’s the tool to conduct code reviews.
The blog post presents how I discovered a way to contribute to a patch set with my own changes. Use with caution as I’m not really sure that’s how gerrit should be used in a team.
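The workflow I ended up with boils down to the following commands (change number and patch set are example values copied from the gerrit UI; the target branch may differ in your setup):

```shell
# Fetch the existing patch set from gerrit and check it out.
git fetch origin refs/changes/45/12345/2 && git checkout FETCH_HEAD

# Make your changes, then amend rather than add a new commit -- gerrit
# groups commits by the Change-Id footer, which --amend preserves.
git add -A
git commit --amend --no-edit

# Push the amended commit as a new patch set of the same change.
git push origin HEAD:refs/for/master
```

The key point is keeping the Change-Id intact, so gerrit treats your contribution as another patch set of the original change instead of a brand-new one.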
So, you’ve got a moment to learn Scala and have IntelliJ IDEA with the Scala plugin installed. You wish to maximize the mental outcome of the time at hand, with little to no effort spent setting up a productive working environment. You may even think you’ve already got one, but unless you’re doing what I describe here, you’re actually far from truly having it. I’m asking you to go the extra mile!
In this blog post I’m introducing you to two modes in the recently shipped IntelliJ IDEA 14.1 – Full Screen and Distraction Free – and the few keystrokes I use to make the development environment a comfortable place to learn Scala. I’m sure you’ll find a few ideas to pave your own way to personal Scala nirvana.
Let’s go minimalistic, full screen, distraction-free, mouse- and touchpad-less!
You may find the blog post What to Check Out in Scala Plugin 1.4.x for IntelliJ IDEA 14 & 14.1 helpful, too.
Side note: it came as a complete surprise to me to notice that I’ve been writing this blog post exactly a month after the last one.