In this contributed article, Sijie Guo, Founder and CEO of StreamNative, believes that with remote work entrenched in the post-pandemic enterprise, organizations are restructuring their technology stacks and software strategies for a new, distributed workforce. Real-time data streaming has emerged as a necessary and cost-efficient way for enterprises to scale with agility. The cost advantages are twofold: architectural and operational.
Introducing The Streaming Datalake
In this contributed article, Tom Scott, CEO of Streambased, outlines the path event streaming systems have taken to arrive at the point where they must adopt analytical use cases and looks at some possible futures in this area.
Video Highlights: Change Data Capture With Apache Flink
The featured video resource provided by Decodable is a webinar in which CDC experts provide an overview of CDC with Flink and Debezium. Change Data Capture (CDC) plays a growing role in real-time data analytics, specifically in stream processing with open source tools like Debezium and Apache Flink. CDC lets users analyze data as it’s generated by leveraging streaming systems like Apache Kafka, Amazon Kinesis, and Azure Event Hubs to track and transport changes from one data system to another.
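To make that flow concrete, here is a minimal PyFlink sketch of the kind of pipeline the webinar describes: a table backed by a database changelog whose change events can be queried like any other stream. It assumes the Flink CDC MySQL connector is available on the classpath; the host, credentials, and table names are illustrative placeholders, not values from the webinar.

```python
# Minimal CDC sketch with PyFlink (assumes the flink-connector-mysql-cdc JAR
# is on the classpath; connection details below are placeholders).
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Source table backed by the MySQL binlog (Debezium does the change capture).
t_env.execute_sql("""
    CREATE TABLE orders_cdc (
        order_id BIGINT,
        amount DECIMAL(10, 2),
        PRIMARY KEY (order_id) NOT ENFORCED
    ) WITH (
        'connector'     = 'mysql-cdc',
        'hostname'      = 'db.example.com',
        'port'          = '3306',
        'username'      = 'flink',
        'password'      = '***',
        'database-name' = 'shop',
        'table-name'    = 'orders'
    )
""")

# Print change events as they arrive; in practice this would be a Kafka,
# Kinesis, or lakehouse sink instead.
t_env.execute_sql("""
    CREATE TABLE print_sink (
        order_id BIGINT,
        amount DECIMAL(10, 2)
    ) WITH ('connector' = 'print')
""")

t_env.execute_sql(
    "INSERT INTO print_sink SELECT order_id, amount FROM orders_cdc"
).wait()
```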
Video Highlights: Modernize your IBM Mainframe & Netezza With Databricks Lakehouse
In the video presentation below, learn from experts how to architect modern data pipelines that consolidate data from multiple IBM data sources into Databricks Lakehouse using Change Data Capture (CDC), a state-of-the-art replication technique.
eBook: Unlock Complex and Streaming Data with Declarative Data Pipelines
Our friend, Ori Rafael, CEO of Upsolver and advocate for engineers everywhere, released his new book “Unlock Complex and Streaming Data with Declarative Data Pipelines.” Ori discusses why declarative pipelines are necessary for data-driven businesses, how they improve engineering productivity, and how they help businesses unlock more potential from their raw data. Data pipelines are essential to unleashing the potential of data and can successfully pull from multiple sources.
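To illustrate the principle behind declarative pipelines (this is not Upsolver's syntax, just a toy sketch over an in-memory table), the snippet below states the desired result and leaves the execution details to the engine, rather than hand-coding loops, buffers, and state management.

```python
# Toy illustration of a declarative pipeline: describe WHAT the output should
# be and let the engine decide HOW to compute it. Table and column names are
# made up for the example.
import sqlite3

events = [("page_view", "us"), ("purchase", "us"), ("page_view", "de")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_type TEXT, country TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)", events)

# Declarative transformation: no imperative aggregation code.
pipeline_sql = """
    SELECT country, COUNT(*) AS purchases
    FROM events
    WHERE event_type = 'purchase'
    GROUP BY country
"""
print(conn.execute(pipeline_sql).fetchall())  # [('us', 1)]
```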
Streamlining Data Evolution in a Rapidly Changing World
In this contributed article, Adam Glaser from Appian believes that in a fast-changing, digital-driven world, having access to the right data at the right time is crucial. That is why, as data evolves, it must be brought together in a reliable and efficient way that creates a powerful asset, not a compliance challenge.
Striim Announces Striim Cloud, the Unified Real-time Data Streaming and Integration SaaS
Striim, Inc. announced the general availability of Striim Cloud, a fast way for customers to deliver the real-time data and insights that power business intelligence and decision-making in the digital economy. Striim Cloud is a fully-managed software-as-a-service (SaaS) platform for real-time streaming data integration and analytics.
The Next-Level of Operationalizing Machine Learning: Real-time Data Streaming into Data Science Environments
In this contributed article, Adi Hirschtein, VP Product at Iguazio, discusses the next level of operationalizing machine learning: real-time data streaming into data science environments. The New Stack’s Streaming Data and the Future Tech Stack report (2019) shows a 500% increase in the number of companies processing data in real-time for AI/ML use cases, and experts posit an even larger increase in the number of companies following this trend as we approach 2021.
Swim Releases New Platform for Managing Continuous Intelligence at Scale
Swim, the developer of the industry’s first open core platform for continuous intelligence at scale, today announced Swim Continuum 4.0, the newest release of its flagship product. By concurrently processing and analyzing streaming and contextual data, Swim Continuum 4.0 gives enterprises a live window into the current state of their business and offers a comprehensive, single-pane-of-glass view of operating and managing continuous intelligence applications at scale.
Analyze-then-Store: The Journey to Continuous Intelligence
In this technical blog for data architects by our friends over at Swim, we learn how to design modern real-time data analytics solutions. It explores key principles and implications of event streaming and streaming analytics, and concludes that the biggest opportunity to derive meaningful value from data – and gain continuous intelligence about the state of things – lies in the ability to analyze, learn and predict from real-time events in concert with contextual, static and dynamic data.