
Bringing The Power Of The DataHub Real-Time Metadata Graph To Everyone At Acryl Data

Data Engineering Podcast

Summary: The binding element of all data work is the metadata graph that is generated by all of the workflows that produce the assets used by teams across the organization. The DataHub project was created as a way to bring order to the scale of LinkedIn’s data needs. How is the governance of DataHub being managed?


Data Engineering Weekly #198

Data Engineering Weekly

[link] Netflix: Netflix’s Distributed Counter Abstraction - Netflix writes about a scalable Distributed Counter abstraction for accurately counting events across its global services with millisecond latency. In the same issue, Canva describes the challenges of maintaining visibility into Snowflake usage and costs due to the platform's diverse user base and workloads.


Data Engineering Weekly #196

Data Engineering Weekly

The challenges around memory, data size, and runtime are exciting to read. Sampling is an obvious strategy for data size, but the layered approach and dynamic inclusion of dependencies are key techniques I learned from the case study. Passes include app-brain-date networking, birds-of-a-feather sessions, post-event parties, etc.


Introducing the dbt MCP Server – Bringing Structured Data to AI Workflows and Agents

dbt Developer Hub

These tools can be called by LLM systems to learn about your data and metadata. As with any AI workflow, remember to take appropriate caution before giving these tools access to production systems and data. For AI agent workflows: autonomously run dbt processes in response to events.
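For a sense of how an agent would call such tools, here is a minimal sketch using the generic MCP Python SDK (the `mcp` package). The `uvx dbt-mcp` launch command and the `list_models` tool name are illustrative assumptions, not confirmed parts of the dbt MCP Server; consult the dbt docs for the actual entry point and tool catalogue.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical launch command for the dbt MCP server; check the dbt docs
# for the actual package name, entry point, and required environment.
server = StdioServerParameters(command="uvx", args=["dbt-mcp"])


async def main() -> None:
    # Open a stdio transport to the server and wrap it in an MCP session.
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server exposes; an LLM agent would feed this
            # tool list into its planning loop rather than printing it.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

            # "list_models" is an illustrative tool name; substitute whichever
            # metadata tool the server actually advertises.
            result = await session.call_tool("list_models", arguments={})
            print(result.content)


asyncio.run(main())
```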


Toward a Data Mesh (part 2): Architecture & Technologies

François Nguyen

TL;DR After setting up and organizing the teams, we describe four topics that make the data mesh a reality. Data as Code is a very strong choice: we do not want any UI, because UIs are a heritage of the ETL period. He/she manages triggers and needs to check conditions (event type?


Building Linked Data Products With JSON-LD

Data Engineering Podcast

Summary: A significant amount of time in data engineering is dedicated to building connections and semantic meaning around pieces of information. Linked data technologies provide a means of tightly coupling metadata with raw information. What is the level of native support/compatibility that you see for JSON-LD in data systems?
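To make the "tightly coupling metadata with raw information" idea concrete, here is a minimal sketch of a JSON-LD document and its expansion using the PyLD library; the dataset name, fields, and context terms are made-up examples, not taken from the episode.

```python
import json

from pyld import jsonld  # third-party: pip install PyLD

# A dataset description whose @context maps short keys to full IRIs,
# so the semantic metadata travels with the record itself.
doc = {
    "@context": {
        "Dataset": "http://schema.org/Dataset",
        "name": "http://schema.org/name",
        "description": "http://schema.org/description",
        "creator": "http://schema.org/creator",
    },
    "@type": "Dataset",
    "name": "daily_orders",  # hypothetical asset, for illustration only
    "description": "Orders aggregated per day from the checkout service.",
    "creator": "data-platform-team",
}

# Expansion rewrites every key to its full IRI, which is what keeps the
# record unambiguous when it is merged with metadata from other systems.
expanded = jsonld.expand(doc)
print(json.dumps(expanded, indent=2))
```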


Addressing The Challenges Of Component Integration In Data Platform Architectures

Data Engineering Podcast

Developing event-driven pipelines is going to be a lot easier - Meet Functions! Data lakes are notoriously complex.