AI++ // context engineering with Manus, securing MCP servers & more
Great news in generative AI in the last week: advanced and experimental models from Google and OpenAI scored gold medals at the International Mathematical Olympiad. I never liked taking maths tests, so I'm very happy to see LLMs take up that challenge for me.
This edition of AI++ looks at what makes up an agent under the hood, considers the parts of MCP you might not have thought about (LOL security), and has a couple of pieces on the way we should think about what we're building with AI today.
Phil Nash
Developer relations engineer for Langflow
Yichao 'Peak' Ji, Manus
Manus is a general AI agent that has captured the attention of many in the AI field, so any lessons we can learn from their team are super valuable. This article covers techniques like taking advantage of caching, using the file system to store context, and manipulating attention to keep the agent on track.
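To make the file-system idea concrete, here's a minimal sketch (my own illustration, not Manus's code): offload a large tool result to disk and keep only a pointer plus a short preview in the message history, which also keeps the prompt prefix small and stable for cache hits. The helper name and workspace path are made up.

```python
# Minimal sketch of "file system as context": instead of pasting a huge tool
# result into the message history (blowing up the prompt), write it to disk
# and keep only a pointer plus a short preview. Hypothetical helper names.
import json
from pathlib import Path

WORKSPACE = Path("agent_workspace")
WORKSPACE.mkdir(exist_ok=True)

def offload_observation(step: int, tool_name: str, output: str, preview_chars: int = 200) -> dict:
    """Persist a large tool output and return a compact message for the context."""
    path = WORKSPACE / f"step_{step:03d}_{tool_name}.txt"
    path.write_text(output, encoding="utf-8")
    return {
        "role": "tool",
        "content": json.dumps({
            "tool": tool_name,
            "saved_to": str(path),          # the agent can re-read this file later
            "preview": output[:preview_chars],
            "total_chars": len(output),
        }),
    }

# Usage: append the compact pointer to the history instead of the raw output.
history = []
big_result = "imagine a 50,000-character web page scrape here... " * 100
history.append(offload_observation(step=1, tool_name="fetch_page", output=big_result))
print(history[-1]["content"][:160])
```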
A video tutorial that shows you how to build an application with Langflow that uses an LLM to compare a resume with a job description. Watch out for the data manipulation early in the flow and the detailed prompt used to drive useful output.
Samuel Colvin, Pydantic
In his talk from the AI Engineer World's Fair, Samuel demonstrates an agent application with an MCP connection, showing important features like streaming logs and observability, and goes into detail on sampling, an underrated feature of MCP.
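If you haven't met sampling before: it lets an MCP server delegate LLM calls back to the connected client, so the server needs no model keys of its own. Here's a minimal sketch using the official MCP Python SDK, not the demo from the talk; the tool name and prompt are invented, and the exact API may differ between SDK versions.

```python
# Minimal sketch of MCP sampling with the official MCP Python SDK (`mcp`):
# the *server* asks the *client* to run an LLM completion on its behalf.
from mcp.server.fastmcp import FastMCP, Context
from mcp.types import SamplingMessage, TextContent

mcp = FastMCP("summarizer")

@mcp.tool()
async def summarize(text: str, ctx: Context) -> str:
    """Summarize text by delegating the LLM call to the connected client."""
    result = await ctx.session.create_message(
        messages=[
            SamplingMessage(
                role="user",
                content=TextContent(type="text", text=f"Summarize in two sentences:\n\n{text}"),
            )
        ],
        max_tokens=200,
    )
    # The client returns a CreateMessageResult; pull out the text if present.
    return result.content.text if isinstance(result.content, TextContent) else str(result.content)

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
```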
Den Delimarsky, GitHub
A deep look into the security requirements of MCP servers, including how the OAuth 2.0 implementation works as well as considerations around authorization, secrets, and observability. It's a high-level overview of many things that you'll need to keep in your head when you build your own MCP server.
Utkarsh Kanwat
Utkarsh might be betting against AI agents, but it's all in the math, and it doesn't even need an experimental model to explain it. The good news is that he has solid ideas on how to build systems that will succeed with the current crop of models.
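The heart of the argument is compounding error: reliability that looks fine per step collapses over a long workflow. A quick back-of-the-envelope illustration (the numbers here are illustrative, not necessarily the ones in the article):

```python
# If each step of an agent workflow succeeds independently with probability p,
# the chance the whole n-step run succeeds is p ** n, which collapses quickly.
for p in (0.99, 0.95, 0.90):
    for n in (5, 10, 20):
        print(f"per-step {p:.0%}, {n:2d} steps -> {p ** n:.1%} end-to-end")
```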
Simon Willison
This is a fun trip through using GitHub Spark, which recently became publicly available, to build an app that documents how Spark itself works. It's both an education in vibe coding and a detailed look under the covers of Spark, including extracting its very detailed system prompt, which lays out the tool's whole design philosophy and how it is not supposed to act as a chat application.
Geoffrey Litt
This is a really interesting take that argues we should move beyond chat as the default interaction between humans, AI, and the tasks we're trying to accomplish, towards Heads-Up Displays (HUDs) that give us the data we need, in the format we need it, at the right time. It's something DataStax's Dieter Randolph brought up in his article on integrating AI into his coding workflow: using AI to generate a debug panel in an application to help you understand it better as you work on it. Let's think beyond the copilot!
Shoshannah Tekofsky
The AI Village is an environment where four agents (Gemini 2.5 Pro, Claude 4 Opus, Claude 3.7 Sonnet, and OpenAI o3) collaborate or compete on different tasks. In this story they organised their own event, one that really did take place. It won't help you develop agents, but it is a fascinating look at what they can achieve, even with multiple failures along the way. The perspective from Larissa Schiavo, a human who volunteered to facilitate the event, is also a fun read.
🧑‍💻 Code & Libraries
am-i-vibing is a library that lets you detect whether your app is being run in an agentic coding environment so you can provide different outputs to AI agents. An interesting experiment to say the least.
Homebrew’s MCP Server gives your coding agents access to run Homebrew commands locally.
Augments is an open-source MCP server for up-to-date documentation for 92 different frameworks, helping your agents write accurate and idiomatic code.
🔦 Langflow Spotlight
This week I wanted to share an absolute workhorse of a Langflow component. You'll see it in the resume analyzer video shared above, you'll see it in any RAG workflow, you'll see it anywhere you need to bring in documents from your local system or via the Langflow API; it's the File component.
The File component allows you to upload files from your system for use within your flows. Any time you need some data from a file right now, or want to ingest data to use later, the File component is there for you.
The File component, the first step in the RAG ingestion flow of your dreams
Missed the live stream? Subscribe to The Flow wherever you like to get your podcasts
5:30pm PDT, July 31st: the Hacking Agents Bay Area July Meetup is at the AWS Builder Loft in San Francisco, with talks on onboarding devs to APIs and SDKs, and on unlocking your archives with Agentic OCR and Multimodal AI.