Exploring the Realm of Scala

Overview

From its elegant syntax to its robust capabilities in building scalable and performant applications, Scala serves as the cornerstone of our development ecosystem. In this article, we will explore what I consider to be the heart of the LotusFlare DNO Cloud™, namely, how Scala is used.

We invite all passionate and talented individuals to engage with this blog and join us in reshaping the future of software engineering. Consider this not just a read but an opportunity to explore open positions on our dynamic team.

What is Scala?

Scala is a powerful and versatile programming language that has gained popularity because it seamlessly blends the object-oriented and functional programming paradigms. Its versatility and expressiveness have made it the go-to choice for many use cases. Scala excels in crafting concurrent and distributed systems, from batch and streaming workloads to big data processing.

Scala offers a vast ecosystem, seamlessly merging the Scala and Java worlds. It enables developers to express complex ideas with brevity, providing a fusion of Object-Oriented Programming (OOP) and Functional Programming (FP). In the words of Martin Odersky, Scala’s creator, it encourages “functions for the logic and objects for the modularity.”
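As a small illustration of that fusion, here is a self-contained sketch: an immutable case class provides the object-oriented modelling ("objects for the modularity"), while a pure, curried function supplies the logic ("functions for the logic"). The `Order` and discount names are invented for this example, not taken from any real codebase.

```scala
object OopFpFusion {
  // Objects for the modularity: an immutable domain model
  final case class Order(id: Int, amount: BigDecimal)

  // Functions for the logic: a pure, curried transformation passed around as a value
  val applyDiscount: BigDecimal => Order => Order =
    rate => order => order.copy(amount = order.amount * (BigDecimal(1) - rate))

  // Higher-order collection operations compose the two styles
  def totalAfterDiscount(orders: List[Order], rate: BigDecimal): BigDecimal =
    orders.map(applyDiscount(rate)).map(_.amount).sum
}
```

The same domain model can be consumed from Java code unchanged, which is part of what makes the two worlds merge so seamlessly.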

Scala stands tall not just for its interoperability with Java but as a robust ecosystem in its own right, boasting an array of libraries and frameworks that cater to diverse domains. Tools like Apache Spark, Apache Flink, and Akka Streams make Scala a go-to choice for big data processing. Additionally, libraries like Akka, Cats, and ZIO (which are dedicated to asynchronous and concurrent programming) add depth and richness to the language’s ecosystem. It’s the intrinsic strength of Scala, both independently and in collaboration with Java, that underpins its popularity among developers.

The statement “Scala is a complex programming language” holds some truth, but the learning curve proves to be a valuable investment. Drawing from my prior experience as a predominantly Java engineer before joining LotusFlare, I can attest to this sentiment.

The learning curve for Scala is much steeper than Java’s. Scala introduces a more complex syntax and adheres to a less-is-more approach to coding, which poses a significant challenge for beginners. In contrast, Java stands out for its ease of learning: a well-structured language with a relatively straightforward syntax, especially when compared with Scala.

Nonetheless, Scala’s versatility allows for a gradual learning progression. Starting with the simpler aspects, beginners can incrementally explore more advanced concepts such as type classes, unconventional uses of generics, Readers, Kleislis, monad transformers such as EitherT, and others. This progression unfolds as a journey into the intricacies of the language, showcasing its depth and richness.
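To make one of those concepts concrete, here is a minimal, self-contained sketch of the type-class pattern using the classic `Show` teaching example (the names are purely illustrative): a capability is defined separately from the data types, instances are supplied implicitly, and generic code works for any type that has an instance.

```scala
object TypeClassDemo {
  // The type class: a capability defined separately from the data types it serves
  trait Show[A] {
    def show(a: A): String
  }

  // Instances for existing types, supplied implicitly
  implicit val intShow: Show[Int] = (a: Int) => s"Int($a)"

  // Instances can be derived: a Show for List[A] given a Show for A
  implicit def listShow[A](implicit s: Show[A]): Show[List[A]] =
    (as: List[A]) => as.map(s.show).mkString("[", ", ", "]")

  // Generic code that works for any A with a Show instance in scope
  def describe[A](a: A)(implicit s: Show[A]): String = s.show(a)
}
```

Unlike interface inheritance in Java, the type class is retrofitted onto `Int` and `List` without modifying them, which is what makes the pattern so useful for library design.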

Why did we choose Scala?

Back in the day, Scala emerged as an excellent choice for those seeking a better Java, offering seamless integration with Java and access to thousands of Java libraries. Additionally, thanks to Apache Spark, Scala became closely linked to data processing.

To recap our focus, Scala is predominantly utilized in the development of data streaming applications. These applications, characterized by their data-intensive and distributed nature, are tailored for the Kafka messaging system, facilitating the real-time streaming of data across various endpoints. Within the realm of data streaming, Scala is employed alongside Kafka streaming libraries, with teams opting for either Alpakka or fs2 based on their preferences. Some applications leverage Apache Camel for simplicity and rapid deployment, particularly when there’s a need to bootstrap an application swiftly.

The evolution of the LotusFlare toolset began with Alpakka as the primary Scala tool, followed by the introduction of fs2 to harness functional streaming capabilities. As we expanded, the inclusion of Apache Camel addressed efficiency concerns and provided a quick solution for specific use cases, ensuring a dynamic and efficient development environment. The landscape is ever-evolving as we navigate diverse challenges and adapt to the dynamic spectrum of business requirements. With Scala at our disposal, LotusFlare developers have the confidence to access the right tools to tackle complex challenges at high speed.

How do we use Scala in LotusFlare?

Let’s take a closer look at how Scala in harmony with Kafka shapes the blueprint of our technological narrative. Picture Scala and Kafka as a “dynamic duo”, weaving a tapestry of complexity, sculpting applications deeply rooted in Event-Driven Architecture. This is the essence of how Scala unfolds within teams at LotusFlare.

It is crucial to establish that Scala takes a secondary role to the primary language, which is undeniably Lua, within our array of company tools. This doesn’t diminish the importance of Scala in the grand scheme of what we develop: Scala serves as the core of our data processing ecosystem, and when coupled with Lua microservices it creates a powerful synergy. To comprehensively explore Lua’s capabilities, I invite you to delve into our dedicated blog post, available here.

Let’s shift our focus back to Scala. As previously highlighted, Scala stands as our solution for various critical aspects such as background processing, data analysis, constructing data pipelines, and building a robust data integration platform. Scala is pivotal when constructing microservices tailored for heavyweight data processing. These microservices serve as the backbone of the LotusFlare DNO Cloud data pipelines, contributing to the broader LotusFlare ecosystem.

How do we use Scala within the LotusFlare Platform team?

As part of the LotusFlare Platform team, I can shed light on how we utilize Scala: the specific tools we employ, the use cases our team tackles, and how we leverage Scala to overcome challenges and deliver impactful solutions.

The primary objective of the product crafted by the Platform team is to enhance the development experience for others. Our focus extends beyond the creation of data streaming applications to shaping a versatile framework (or builder) that enables others to construct their own data pipelines. It facilitates not only the implementation of streaming applications but also API development for the creation and management of diverse data flows, incorporating our services into the company’s ecosystem. This introduces many interesting problems: taming the complexities of distributed systems, tackling concurrency issues, and delving into various other performance considerations.

I will now delve into one of our services to explain its construction and the specific Scala features it employs. Without revealing specific details or the application’s name, it can be likened to an HTTP Sink Kafka Connector that offers additional functionality to enhance its capabilities.

The primary objective of this application is to facilitate the creation of Kafka Consumers that ensure reliable delivery to the specified HTTP endpoints. Beyond conventional Kafka Consumer functionality, it operates as a builder that empowers users to configure intricate data pipelines, such as Kafka-to-HTTP processing. The application orchestrates the execution of all configured workloads as isolated tasks, enhancing the flexibility and efficiency of data processing.
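Conceptually, the builder idea can be sketched in plain Scala as below. This is a simplified, in-memory stand-in with invented names (`PipelineConfig`, `deliver`, `runAll`); the real service consumes from Kafka and delivers over HTTP rather than mapping over in-memory lists.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object PipelineBuilderSketch {
  // Each configured pipeline pairs a source topic with a target endpoint
  final case class PipelineConfig(topic: String, endpoint: String)

  // Stand-in for "consume records from Kafka, deliver them over HTTP"
  def deliver(config: PipelineConfig, records: List[String]): List[String] =
    records.map(r => s"${config.endpoint} <- [${config.topic}] $r")

  // Each configured workload runs as its own isolated task; a failure in one
  // pipeline does not take down the others
  def runAll(configs: List[PipelineConfig],
             records: List[String]): Future[List[List[String]]] =
    Future.traverse(configs)(c => Future(deliver(c, records)))

  // Blocking helper for demonstration only; production code stays asynchronous
  def runAllSync(configs: List[PipelineConfig],
                 records: List[String]): List[List[String]] =
    Await.result(runAll(configs, records), 5.seconds)
}
```

The key design point the sketch tries to capture is that pipelines are data (the config), and the runtime turns each one into an independently scheduled task.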

The Kafka Consumer implementation lies at the heart of this application, and our Platform team leverages ‘alpakka-kafka’ to handle this functionality, with plans to transition to its fork, the Apache Pekko Connectors Kafka connector. To streamline the usage of ‘alpakka-kafka’, we encapsulate its complexities within custom libraries and wrappers. The intricate logic of the consumer flow predominantly relies on Monix and a monadic approach, composing the stages with flatMap.
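That flatMap-composition style can be illustrated with the standard library’s Future standing in for Monix’s Task (the stage names here are invented for the sketch, not the real consumer code):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ConsumerFlowSketch {
  final case class Record(value: String)

  // Each stage returns an effect rather than a plain value
  def deserialize(raw: String): Future[Record] = Future(Record(raw.trim))
  def transform(r: Record): Future[Record]     = Future(r.copy(value = r.value.toUpperCase))
  def send(r: Record): Future[String]          = Future(s"delivered: ${r.value}")

  // flatMap sequences the stages; a failure in any stage short-circuits the chain
  def handle(raw: String): Future[String] =
    deserialize(raw).flatMap(transform).flatMap(send)

  // Blocking helper for demonstration only; production code stays asynchronous
  def handleSync(raw: String): String = Await.result(handle(raw), 5.seconds)
}
```

With Monix, the same shape applies, with the added benefit that Task is lazy: the pipeline is a description of work that only runs when executed, which makes retries and error handling easier to express.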

Monix is valuable in managing concurrency, ensuring high throughput without unnecessary complexity. The team employs Circe for Kafka deserialization and JSON manipulation before sending events to the HTTP endpoint. As the HTTP client, we use sttp, an abstraction layer that utilizes the OkHttp client as its backend. The combination of these tools equips our applications to maintain and enhance the core service within our framework for constructing data pipelines.

Conclusion

Recognized as a leading functional programming language, Scala boasts extensive resources for learning and practicing functional programming. Even within the Scala community, Scala is sometimes labeled a less promising language for the decades ahead. This perception is attributed to various factors, such as minimal marketing efforts, limited community management, and historical issues with tooling. While no language is without its imperfections, Scala’s setbacks appear minor in the grand scheme. Even granting that its tools and library stability could improve, the overall benefits and opportunities Scala offers far outweigh these considerations. Scala is, in my view, a powerhouse because it fosters continuous learning, growth, and robust development experiences. Ultimately, Scala’s deep integration within numerous projects, continuous evolution, and growing strengths suggest a promising and expansive future.

Oleksandr Balyshyn
LotusFlare

Explore current openings here.