by Christian Silva & Marc Montanari
Marc and Christian invite you to join their coffee break with guests to discuss their experience in the ever-growing data field. From technical discussions to the tangible impact of data, this podcast guides you on your journey to thrive as a data sales engineer, solution architect, or data analyst/consultant.
LinkedIn - https://www.linkedin.com/company/the-data-coffee-break-podcast/
Website - https://www.thedatacoffeebreakpodcast.com/
This podcast represents our views and not the ones of our employers.
Language
🇺🇲
Publishing Since
10/18/2022
September 3, 2024
Ever wondered how cycling from London to Paris could inspire an in-depth discussion about AI? Join us as Marc shares his scenic adventure along the charming Avenue Verte, setting the stage for a captivating exploration of the world of open-source large language models (LLMs). Christian and Marc delve into the heart of AI, examining the strengths and limitations of open models like Mistral, Meta's Llama, and Google's Gemma compared to closed systems such as GPT, Gemini, or Claude...
June 20, 2024
Grab your mug, it's Data Coffee Break time with Christian and Marc! ☕️

This episode, we're brewing up a hot topic: RAG vs. giant context windows. You might be wondering: what in the world is RAG, and why should I care? Settle in, because we're about to break it down for you, whether you're an AI rockstar or just dipping your toes into the data pool.

We chat about the latest news in the wild world of AI, then dive deep into this whole RAG situation. Is the future all about large-context-window models with millions of tokens, or will RAG still be king of the castle? We explore the pros and cons of each approach, so you can see which one might be the better fit for your next project.

Plus, we won't just leave you hanging with theory! We also uncover the real-world uses for both RAG and long-context models. Think of it as peeking into the future and seeing how these technologies will change the game!

So grab your favourite drink, hit play, and get ready for a stimulating conversation that will leave you hyped about the future of AI.

Follow us on LinkedIn (https://rebrand.ly/dcb-linkedIn), Instagram (https://rebrand.ly/dcb-instagram), Twitter (https://rebrand.ly/dcb-twitter), TikTok (https://rebrand.ly/tiktok-), and YouTube (https://rebrand.ly/dcb-yt)!
Music by Skilsel (https://tinyurl.com/4v9beebw).
This podcast represents our views and not the ones of our employers.
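For readers curious what the RAG half of that debate actually looks like in practice, here is a minimal, purely illustrative sketch (not from the episode): rather than stuffing every document into a giant context window, a RAG pipeline first retrieves the passages most relevant to the question and prepends only those to the prompt. The scoring function, toy corpus, and function names below are all hypothetical simplifications; real systems use embedding-based vector search instead of word overlap.

```python
# Illustrative RAG sketch: retrieve relevant passages, then build the prompt.
# Toy corpus and naive word-overlap scoring are hypothetical stand-ins for
# the embedding-based vector search a production system would use.

def score(question: str, passage: str) -> int:
    """Crude relevance score: number of lowercase words shared."""
    return len(set(question.lower().split()) & set(passage.lower().split()))

def retrieve(question: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k passages with the highest overlap score."""
    return sorted(corpus, key=lambda p: score(question, p), reverse=True)[:k]

def build_prompt(question: str, corpus: list[str]) -> str:
    """Prepend only the retrieved context to the question."""
    context = "\n".join(retrieve(question, corpus))
    return f"Context:\n{context}\n\nQuestion: {question}"

corpus = [
    "RAG retrieves relevant documents before generation.",
    "Large context windows can hold millions of tokens.",
    "Coffee is best enjoyed during a data podcast.",
]
print(build_prompt("How does RAG handle relevant documents", corpus))
```

The long-context alternative the episode contrasts this with would simply concatenate the entire corpus into the prompt, trading retrieval complexity for token cost.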
January 22, 2024
One year after our first attempt to predict the data trends of 2023, we look back at our predictions and reflect on how transformative the year was. Then we gear up for 2024 with a heated debate with Ivan Coelho, our infamous guest, to dissect what could be a transformative year for the data and AI space. Tune in for our take on 2024 and some tips along the way.

Some interesting reading:
https://sloanreview.mit.edu/article/five-key-trends-in-ai-and-data-science-for-2024/
https://towardsdatascience.com/trends-that-will-shape-the-modern-data-stack-in-2024-6b7de28335c2
https://www.datacamp.com/blog/the-top-5-vector-databases
https://towardsdatascience.com/navigating-the-ai-landscape-of-2024-trends-predictions-and-possibilities-41e0ac83d68f#b80d
https://blog.eliassen.com/projecting-the-top-trends-in-data-in-motion

Follow us on LinkedIn (https://rebrand.ly/dcb-linkedIn), Instagram (https://rebrand.ly/dcb-instagram), Twitter (https://rebrand.ly/dcb-twitter), TikTok (https://rebrand.ly/tiktok-), and YouTube (https://rebrand.ly/dcb-yt)!
Music by Skilsel (https://tinyurl.com/4v9beebw).
This podcast represents our views and not the ones of our employers.