
The Smaato Blog

How we use Groovy at Smaato

Posted by Pavlo Fedyna on February 8, 2017


Smaato operates one of the world’s largest mobile advertising marketplaces. Our platform handles up to 10 billion mobile ads every day around the world. One of the challenges we as Smaato developers face daily is integrating and communicating with the wide variety of technical interfaces used by the many partners who transact on our marketplace platform.

In this blog post, we want to give the technically inclined among you a glimpse into the challenges and successes we have had in addressing the limitations of a monolithic application by implementing extensible components in Groovy, and to discuss why our developers chose Groovy over the other available options.

Read more »

Building better software tests with Spock

Posted by Gerd Rohleder on October 14, 2016

Development is one of the key activities that build, enable and ultimately power the overall mobile environment. Today we are sharing some key insights into an important aspect of development – the choice of testing framework – from one of our most experienced developers here at Smaato.

Developers know that choosing the right testing framework for the job can be complicated. The JUnit framework is an excellent Java testing tool, but for tests that require sample data, better tools - like the Groovy-based Spock - are available. This post describes how to use Spock and how it handles test data. It also explores how Spock tests can be executed alongside JUnit tests with Maven.

How JUnit runs parameterized tests

To have a test that works with a set of test data, you define a data method that returns all samples in a Collection. Each sample is used to create a new instance of the test class, so the constructor needs parameters matching the test samples. All test methods are then executed for each instance, as you can see in the code sample below:
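The original code samples sit behind the "Read more" link; as a stand-in, here is a minimal, hypothetical JUnit 4 sketch (class and test names are ours, not from the post) showing the @Parameters data method, the per-sample constructor, and a test method:

```java
import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

@RunWith(Parameterized.class)
public class AdditionTest {

    // The data method: returns all samples as a Collection of constructor arguments.
    @Parameters
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[][] {
            { 1, 1, 2 },
            { 2, 3, 5 },
            { -1, 1, 0 }
        });
    }

    private final int a;
    private final int b;
    private final int expected;

    // JUnit creates a new instance of this class for every sample row.
    public AdditionTest(int a, int b, int expected) {
        this.a = a;
        this.b = b;
        this.expected = expected;
    }

    // Every test method runs once per sample instance.
    @Test
    public void addsTwoNumbers() {
        assertEquals(expected, a + b);
    }
}
```

JUnit instantiates the class once per row of the data method and runs every @Test method against that instance.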

Read more »

Speed vs. Security: Optimizing Secure Communication

Posted by Johan Beisser on April 12, 2016

Internet traffic is moving increasingly towards ubiquitous use of secure connections (HTTPS rather than HTTP). Proper handling of these secure connections in order to make them as fast as possible is an area of real concern for anyone running a web-based or mobile business.

For mobile companies like Smaato, the challenge with HTTPS is not encryption itself, but rather the certificate exchange and session key negotiation, which impose additional overhead on a small request over cellular networks. Even high-speed 4G/LTE carrier networks can be problematic for the SSL/TLS handshake.
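As a rough, hypothetical illustration (not code from the post), the following Java snippet times connection setup against an example host. The measured time covers the TCP connect plus the TLS certificate exchange and session key negotiation, before a single byte of the actual request is sent:

```java
import javax.net.ssl.SSLSocket;
import javax.net.ssl.SSLSocketFactory;

public class HandshakeTimer {
    public static void main(String[] args) throws Exception {
        String host = "www.example.com"; // hypothetical endpoint, replace as needed
        SSLSocketFactory factory = (SSLSocketFactory) SSLSocketFactory.getDefault();

        long start = System.nanoTime();
        try (SSLSocket socket = (SSLSocket) factory.createSocket(host, 443)) {
            // Completes the TLS handshake: certificate exchange and session key negotiation.
            socket.startHandshake();
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("Connect + TLS handshake to " + host + " took " + elapsedMs + " ms");
        }
    }
}
```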

Let’s take a look at a highly simplified HTTPS request from a mobile device:

Read more »

Automated End-To-End Testing with Protractor, Docker and Jenkins

Posted by Marco Pajares Silva on March 9, 2016

It’s widely acknowledged that testing is a key aspect of software quality. Due to the complexity of modern software development, many firms have started using end-to-end testing procedures as part of their software release process.

What is end-to-end testing? Let’s start with a definition. “End-to-end testing is a technique used to test whether the flow of an application right from start to finish is behaving as expected. The purpose of performing end-to-end testing is to identify system dependencies and to ensure that the data integrity is maintained between various system components and systems.”1

A few months ago at Smaato, we decided to implement end-to-end tests for our Publisher Platform (SPX). SPX is a Java application on the backend with a mix of AngularJS, PrimeFaces, and pure HTML pages on the frontend. We use a continuous integration system based on Jenkins to build the application and run the tests.

Read more »

Getting Started With User Acceptance Tests

Posted by Gerd Rohleder on January 22, 2016

In our quest to develop the most efficient, full-featured mobile-first advertising platform, it became evident that Smaato's development team had to adopt user acceptance tests (UATs). Manual checks done directly on releases, and integration tests covering small parts of the application, were not sufficient to minimize the risk of bugs. Moreover, we couldn't afford to spend a full day on complete round trips or tests for a feature. UATs allowed us to run those crucial tests in an automated fashion and focus on quality.

Read more »

Sharing Our Experience With Scala at Smaato

Posted by Stephan Brosinski on November 26, 2015

One year ago, we in Smaato's data engineering team decided to use the Scala programming language instead of Java for all new applications. This decision was driven by a number of reasons. First, the software we use in this space (Spark and Kafka) is predominantly developed in Scala, and we need a good grasp of the language to work with these tools and to contribute to and extend them. Furthermore, our development team was pushing to use a modern language, and Scala's functional aspects lend themselves well to the kinds of problems we have to solve.

Being able to seamlessly interface with Java code allows us to leverage our existing code base and work with other teams at Smaato. This is a good moment to reflect on the transition and talk about our learnings.

Read more »

Agile at Smaato: How to Write a Good User Story

Posted by Anton Sherstiuk on November 23, 2015

Agile is eating the world, and user stories are at the heart of it. There's hardly a person in the business of software development who doesn't know the famous Connextra template: "As a <role>, I want <goal/desire> so that <benefit>." The management attitude depicted in this Dilbert comic strip is fading into oblivion.

The user story is a simple concept. It's easy to start writing one, and that's one of its strengths. But writing good user stories is a whole different story.

Read more »

Spark on Docker on Amazon EC2: Only the Code Tells You Everything

Posted by Dr. Stefan Shadwinkel on November 13, 2015

Our global real-time advertising platform processes vast amounts of data per second. Therefore, managing, supporting, and enhancing all of its tools and processes with data-driven solutions is crucial to our success.

Developing these solutions requires a flexible setup that can also be easily scaled to allow testing on reasonable data sizes. One part of our current setup is running Apache Spark on Docker on Amazon EC2 instances.

Using plain EC2 instances instead of EMR has the benefits of lower costs and the ability to run the latest releases or development builds of Spark directly.

In this blog post, we will look into the peculiarities of configuring Spark on Docker on EC2 and dive into some Spark code excerpts to understand Spark's behavior.

Read more »

Microservices: Are They the Right Architecture for You?

Posted by Arne Schipper on July 29, 2015

These days, many across our industry and others are talking about microservices. It’s one of the buzzwords of the moment, even though the topic as such has a far longer history. With companies like Netflix, Gilt and LinkedIn, among others, drawing attention to this architecture, many smaller companies find themselves confronting this very issue. With cloud providers offering more and more microservice support, and with tools and frameworks evolving around this topic, we at Smaato have also been asking if this is the direction we’d like to go.

Read more »

Big Data & NoSQL Meetup Hamburg with Apache Flink at Smaato

Posted by Dr. Stefan Shadwinkel on July 17, 2015

Smaato was very happy to host the spring to summer edition of the Big Data and NoSQL Hamburg (BDNSHH) meetup with two great guests from Berlin: Aljoscha Krettek and Maximilian Michels from dataArtisans, the company behind Apache Flink.

Apache Flink is an open source platform for scalable batch and stream data processing. At its core is a streaming dataflow engine that provides data distribution, communication, and fault tolerance for distributed computations over data streams. Interesting features are its custom dataflow optimizer, custom memory management, and its strategies to perform well when memory runs out.
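For readers who haven't seen Flink code yet, here is a minimal, hypothetical streaming word count in Java (the usual introductory example, not material from the meetup) showing the dataflow style the engine executes:

```java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class StreamingWordCount {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical source: lines of text arriving on a local socket (e.g. from `nc -lk 9999`).
        DataStream<String> lines = env.socketTextStream("localhost", 9999);

        lines
            // Split each line into (word, 1) pairs.
            .flatMap(new FlatMapFunction<String, Tuple2<String, Integer>>() {
                @Override
                public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                    for (String word : line.toLowerCase().split("\\W+")) {
                        if (!word.isEmpty()) {
                            out.collect(new Tuple2<>(word, 1));
                        }
                    }
                }
            })
            // Group by the word and keep a running count per key.
            .keyBy(0)
            .sum(1)
            .print();

        env.execute("Streaming WordCount");
    }
}
```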

We’ve interviewed our guests to dig deeper into Apache Flink:

Read more »