Memberships

Data Innovators Exchange

Public • 170 members • Free

59 contributions to Data Innovators Exchange
Modelling Address Data
The question arose as to the correct way to model address data. In the latest session of Data Vault Friday, Michael Olschimke explained his point of view and shared his idea of how to proceed. You can find the video either here in the classroom or below. How would you model address data? Would you do it differently from Michael?
6 likes • 3 comments • New comment 12h ago
Modelling Address Data
2 likes • 5d
Thanks for sharing!
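To make the address-modelling question above a bit more concrete, here is a minimal Python sketch of one common Data Vault 2.0 pattern: keep the address as descriptive context in a satellite hanging off the party's hub, with a hash key for the business key and a hashdiff for change detection. The table and column names are assumptions for illustration only, not Michael's actual model from the video.

```python
import hashlib
from datetime import datetime, timezone

def dv_hash(*parts: str) -> str:
    """MD5 over normalized, delimiter-joined parts -- a common Data Vault 2.0 hashing convention."""
    normalized = "||".join(p.strip().upper() for p in parts)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

load_dts = datetime.now(timezone.utc)
record_source = "CRM"  # hypothetical source system

# Business key of the party the address belongs to (illustrative).
customer_bk = "CUST-10042"

address = {
    "street": "Domstrasse 1",
    "postal_code": "50667",
    "city": "Cologne",
    "country_code": "DE",
}

# Hub row: one row per business key, identified by its hash key.
hub_customer = {
    "hub_customer_hk": dv_hash(customer_bk),
    "customer_bk": customer_bk,
    "load_dts": load_dts,
    "record_source": record_source,
}

# Satellite row: the address as descriptive context on the hub,
# with a hashdiff so a new row is only inserted when the address changes.
sat_customer_address = {
    "hub_customer_hk": hub_customer["hub_customer_hk"],
    "hashdiff": dv_hash(*address.values()),
    **address,
    "load_dts": load_dts,
    "record_source": record_source,
}

print(hub_customer["hub_customer_hk"], sat_customer_address["hashdiff"])
```

Whether the address belongs in a satellite on the customer hub, in its own address hub with a link, or split across several satellites by rate of change depends on the business case, so the answer in the video may well differ from this sketch.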
🚨Klarna's AI Move: Cutting Jobs for Efficiency?🚨
Klarna is planning to halve its workforce, leveraging AI to boost productivity ahead of a potential IPO. They've already trimmed from 5,000 to 3,800 employees, with more cuts likely. **Question for you:** What do you think of this from an ethical perspective? Would you let AI run your team? 🤔
4 likes • 4 comments • New comment 6d ago
1 like • 6d
@Christof Wenzeritt You make a good point about AI being used as a reason for layoffs, especially with an IPO on the horizon. The real reasons behind these job cuts might be more complicated than what's being shared publicly. But I also think this could become a more common scenario in the future as companies increasingly leverage AI to streamline operations and cut costs.
What Was Your First Experience Working with Data? Any Fantasy Football experts here?
Hey everyone, I'm curious to know—what was your entry point into the world of data? For me, it all began with Fantasy Football. I wanted to create my own stat analytics, so I started querying player databases using an API. There’s so much to analyze in American Football, and with our fantasy draft kicking off this Sunday, I’m reminded of those early days. So, what was your first data project? How did you get started? Looking forward to hearing your stories!
8 likes • 9 comments • New comment 6d ago
2 likes • 9d
@Volker Nürnberg I haven't heard of most of that, but that's awesome—starting with a Commodore 64 is legit old-school cool! 😄 It's wild to think how far we've come from those days. Sounds like you've been into data way before it was a thing.
0 likes • 6d
@Richard Sklenařík I agree, that’s a really cool story! It’s amazing to see how you and your classmates handled such an important project back then. I wish they would do stuff like this more often nowadays.
Let's build RAG in watsonx with optimal accuracy and cost
Explore different options for building RAG with IBM watsonx and learn advanced techniques for achieving accurate and cost-effective solutions.
Thursday, 29th August | 3pm-3:45pm AEST
Register link: https://info.techdatacloud.com.au/ibm-ai-webinar-series-lets-build-rag-in-watsonx-with-optimal-accuracy-and-cost-aug2024
8 likes • 5 comments • New comment 8d ago
1 like • 12d
This sounds really interesting! I've never used IBM watsonx before. Will this session cover the basics for beginners, or is it more advanced?
0 likes • 8d
@Sheng Lan I wanted to follow up on this one. It was a really cool session on Thursday! Do you have a recording link for it?
Data Cleaning - What tools do you use?
Interested to hear what tools you use to master messy data. This webinar on Snowflake integration with Gigasheet might be of interest if you use Snowflake. Some of the points covered include:
- Gigasheet enables easy integration with Snowflake, allowing data engineers to work with large datasets without needing advanced coding skills like SQL or Python.
- The platform provides powerful, user-friendly tools for cleaning and transforming messy data, including features like cross-file lookups and data type corrections, making it easier to prepare data for analysis.
- It is designed to handle massive datasets, scaling up to billions of records while maintaining a familiar spreadsheet interface, so data engineers can perform complex operations on large-scale data effortlessly.
- Self-service data access reduces dependency on data engineering teams for routine queries and empowers business users to perform basic analytics independently.
- Security and compliance: role-based access controls, data retention policies, and SOC 2 compliance.
What tool would you recommend?
6 likes • 2 comments • New comment 8d ago
Data Cleaning - What tools do you use?
0 likes • 8d
I agree with Tim—I usually stick to SQL and Python, but Gigasheet looks like a great option for those who need a simpler tool. Worth checking out!
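Since SQL and Python came up, here is a minimal pandas sketch of the kind of cleaning steps the post lists (data type corrections, deduplication, and a cross-file lookup). The DataFrames and column names are made up purely for illustration, not taken from Gigasheet or Snowflake.

```python
import pandas as pd

# Hypothetical messy export: text-typed numbers, inconsistent casing, duplicates.
orders = pd.DataFrame({
    "order_id": ["1001", "1002", "1002", "1003"],
    "amount":   ["19.99", "5,00", "5,00", "42.10"],
    "country":  ["de", "DE", "DE", " at "],
})

# Data type correction: normalize the decimal separator and cast to float.
orders["amount"] = orders["amount"].str.replace(",", ".", regex=False).astype(float)

# Standardize categorical values, then drop duplicate orders.
orders["country"] = orders["country"].str.strip().str.upper()
orders = orders.drop_duplicates(subset="order_id", keep="first")

# Cross-file lookup: enrich from a second (reference) table via a join.
regions = pd.DataFrame({"country": ["DE", "AT"], "region": ["DACH", "DACH"]})
clean = orders.merge(regions, on="country", how="left")

print(clean)
```

If the data already sits in Snowflake, the same steps translate directly to SQL with CAST, TRIM/UPPER, DISTINCT, and a LEFT JOIN against the reference table.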
Lorenz Kindling
Level 5 • 298 points to level up
@lorenz-kindling-9486
Senior BI Consultant. Expert in Data Warehousing, Data Vault 2.0, and automation. Based in Cologne, I love traveling.

Active 4d ago
Joined Jul 1, 2024