Decodable, the well-funded real-time data engineering and stream processing platform built in part on the open-source Apache Flink project, is launching a major update today that aims to make the service more attractive to large enterprises. Alongside an expanded library of connectors for ingesting data from more sources and a new enterprise support option, the marquee feature is automated task sizing: Decodable can now dynamically size workloads as needed to optimize performance and cost.
As Decodable founder and CEO Eric Sammer told me, he believes that while stream processing itself is quickly becoming mainstream, with Apache Flink emerging as the de facto standard, what happens around it isn't quite mainstream yet, in part because until now, only businesses with highly sophisticated engineering teams have been able to build on top of this technology.
“The analogy I think about is networking switches,” he explained. “We can move packets back and forth. The next iteration of that — the part that I don’t think is fully mainstream yet — is the processing capability on top of that — the ability to transform and aggregate. What I think we would have called streaming analytics 10 years ago.”
The likes of Lyft, Uber and Stripe are able to build this enterprise-grade layer of abstractions on top of their real-time data streams themselves. Others, he argues, need a tool like Decodable to do so, especially if they want to build consumer-facing applications.
“Streaming is one piece of technology,” he also noted. “It’s a collection of Debezium plus Kafka or Redpanda or whatever — plus Flink, plus APIs and all these other kinds of things. And it’s cost-prohibitive to stand up a team and operationalize that. That’s where we add value. And that’s why we focus on developer experience and self-service and enterprise features.”
As for the new task-sizing feature, Sammer noted that users can tell the service the maximum number of tasks they want to dedicate to a given workload. Decodable then scales up to that maximum, or back down when the workload allows. For a lot of users, this may result in lower spending overall. And while that may also mean they won’t spend as much on Decodable, Sammer believes that if the company can make stream processing easier and more cost-effective for its users, these users will bring more workloads to the service. “There are very few cases where you wouldn’t want fresher, more accurate data — or be able to make a better decision in real time,” he said. “So from that perspective, every time we make it cheaper for someone to run a workload on Decodable, they add more workloads.”
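To make the mechanism concrete, here is a minimal conceptual sketch of cap-bounded autoscaling of the kind described above. This is not Decodable's actual API or implementation; the function name, utilization signal and thresholds are all hypothetical, chosen only to illustrate scaling a task count up toward a user-set maximum and back down when demand drops.

```python
# Conceptual sketch only -- NOT Decodable's API. Illustrates cap-bounded
# autoscaling: grow toward a user-configured maximum under load, shrink when idle.

def next_task_count(current_tasks: int,
                    utilization: float,   # hypothetical signal: fraction of capacity in use (0.0-1.0)
                    max_tasks: int,       # the user-set ceiling for this workload
                    min_tasks: int = 1) -> int:
    """Return the task count to use in the next scaling interval."""
    if utilization > 0.8 and current_tasks < max_tasks:
        return min(current_tasks + 1, max_tasks)   # scale up, never past the cap
    if utilization < 0.3 and current_tasks > min_tasks:
        return max(current_tasks - 1, min_tasks)   # scale down to cut cost
    return current_tasks                           # otherwise hold steady


# Example: a workload capped at 8 tasks, currently running 4 at 90% utilization,
# would be stepped up to 5 tasks; at 10% utilization it would be stepped down to 3.
print(next_task_count(current_tasks=4, utilization=0.9, max_tasks=8))  # -> 5
print(next_task_count(current_tasks=4, utilization=0.1, max_tasks=8))  # -> 3
```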