Pybites Podcast
The Pybites Podcast - Insights to become a world-class developer.
Coding is only half the battle. To truly succeed in the tech industry, you need more than just syntax; you need strategy.
The Pybites Podcast is your weekly mentorship session on the soft skills and career skills that senior developers use to get ahead.
Join Pybites co-founders Bob Belderbos (ex-Oracle) and Julian Sequeira (ex-AWS) as they share real-world insights on mastering the developer mindset, crushing imposter syndrome, and navigating your career with confidence.
Whether you are a self-taught beginner stuck in tutorial hell or a senior dev looking for that extra edge, we cut through the fluff to help you build a career you love.
Website: https://pybit.es
Julian: https://www.linkedin.com/in/juliansequeira/
Bob: https://www.linkedin.com/in/bbelderbos/
#217: Revisiting Quiet Links with Tim Gallati
Bob is joined by Tim Gallati and Juanjo to unpack how they integrated a Retrieval-Augmented Generation (RAG) pipeline into an existing Python application, without rewriting everything from scratch.
This episode is a reminder that impactful AI projects donβt require massive teams or cutting-edge complexity - just solid design, curiosity, and a willingness to learn!
Connect with Tim on LinkedIn: https://www.linkedin.com/in/timgallati/
Visit the Quiet Links website: https://quietlinks.com/
___
Want to become a more focused, motivated, and effective Python developer? The Pybites Developer Mindset (PDM) Program helps you build strong habits, deepen your skills, and make real progress in just six weeks. Join us for mentorship, accountability, and a supportive community that keeps you moving forward. Start your PDM journey now! https://pybit.es/catalogue/the-pdm-program/
___
If you found this podcast helpful, please consider following us!
Start Here with Pybites: https://pybit.es
The most surprising thing to me of this whole project was that it's not a huge technical challenge in Python to do what we're about to talk about. It's actually pretty straightforward. It's just knowing what that pipeline is that's the real enabler here. And I want to encourage folks out there to look at this particular piece, because even if you consider yourself a beginner or an advanced beginner, as I do myself, it's completely within reach.
Julian:Hello and welcome to the Pybites Podcast, where we talk about Python, career, and mindset. We are your hosts. I'm Julian Sequeira.
Bob:And I am Bob Belderbos. If you're looking to improve your Python, your career, and learn the mindset for success, this is the podcast for you. Let's get started. Hello and welcome back everybody to the Pybites Podcast. It's Bob Belderbos, and I'm here with Juanjo and Tim Gallati. Welcome, guys. Hi, how are you?
Tim:Hey Bob, great to be here.
Bob:Yeah, nice to have you back. Welcome back as well, Juanjo. Nice to have you here too. And today we're going to talk about the next iteration of Quiet Links. So you have been on the podcast before, in episode 60. We were just looking, and it was almost four years ago, the 26th of February 2022. And then you shared with us the Quiet Links search database and the importance of having an MVP, right? Well, MVP it had become, and now it's MVP part two, right? Because lately you have added AI, and you did that in PDM with us, with Juanjo here, right? It was good to bring him on as well. So we want to dive into how you did it, the RAG component of it, and how it's helping the app and current user base make, I guess, smarter searches. But before we do that, maybe you want to reintroduce yourself to the audience, because it's been quite a while.
Tim:It has. Well, thanks, Bob. It's always exciting to be here. I mean, there's just so much energy and, I don't know, growth that happens in everything we do together. So yeah, as you mentioned, my name's Tim Gallati. As for my day job, I love my job title: I'm a cybrarian at Oracle, which is, you know, a modern-day librarian. And so a project like Quiet Links is a natural extension of just who I am, what I do as a librarian. And it's more of a public service, which I'm thrilled to offer. And, gosh, Bob, as part of our introduction, we should remind ourselves of where we met, which was at Oracle, I want to say going on seven years ago, in 2019. So, you know, we've worked together for a long time, and it's great to be back and to meet Juanjo and to have worked with Juanjo on this project. It's really been superb.
Bob:Excellent. So, yeah, maybe as a refresher, what does Quiet Links do as a tool, and why did you build it? And then we can dive into the AI enhancement you've built with Juanjo.
unknown:Yeah.
Tim:Absolutely. So Quiet Links came about as a natural extension of a master's program I did, where I was looking at the effects of noise pollution and the benefits of quiet. And I found that there was lots of research out there, but it was all disjointed, disparate. Researchers weren't really talking to each other. And despite all the great information, it was hard to track down. Noise pollution is something that tracks right behind air pollution, so they're typically together. It's ranked number two in terms of health concerns, and the biggest number that still to this day really grabs me is that the World Health Organization estimated, back in 2011, that in Europe alone one million years of healthy human life is lost each year due to noise pollution. There are all the physiological effects: cardiovascular, sleep, all these things, and they compound. So there's a lot of work done in this field. And Quiet Links was an effort to answer the calls I was getting for finding the research that applies to people's specific use case. It could be education, work environments, hospitals, urban planning; these all have huge bodies of research. So Quiet Links, initially back in 2022, when we did our last podcast, was a search solution, a way to search the research and find it based on your topics. And now we're in this new phase of AI, and RAG was a perfect solution for this kind of question: you have all of these documents, right? How do we get to the information that's relevant to us and think it through?
Bob:Interesting. Yeah, is it more heavily represented in Spain? It's quite a noisy country. Yeah, totally agree with that. As I speak, they're drilling here and stuff. I'm like, oh, I'm already getting nervous here. So, yeah, that's super interesting. And congrats on, you know, building that, because, as we all know, launching an MVP is not easy. Juanjo, you did your first MVP this week as well, right? Maybe you want to highlight that first, because we always do wins on the show.
Juanjo:No, no, yeah, definitely it's a win. The flow plan is alive. It was initially a tool built for me, but I decided to put it out there and give it to the audience, to people who want to use it and have similar problems as myself. And of course, it's powered by AI. So yeah, this is my win.
Bob:Yeah, we'll link it in the show notes. Thank you. So Tim, you had the MVP, it was running for years, and you had some good traffic there. At what point did you think, we need to have AI integrated into it? Was there user demand, or was it scratching your own itch? What triggered that?
Tim:Yeah. Well, the nice thing about Quiet Links was that because it's a public service, free and open, like you said, it was getting quite a bit of traffic. So folks were getting in there, getting what they needed, which saved me time as well, because folks would come to me with these questions. But as AI evolved and RAG solutions became sort of the talk of the town (we're a little bit past that; we're now talking about agents and such), technically it just seemed like a great solution for a collection of articles that you could essentially put into this system. And what I was hearing from users in terms of scenarios was that they were very glad to have the search engine, but there was that extra mile they would have to go to find exactly what they were looking for. Because usually, especially with researchers and journalists, they want something very specific. I mean, like, super specific. A researcher could be: I am very interested in the inner-ear workings, the cochlear functioning, of people of such-and-such ages in education environments, or something super specific. So they would be going through these articles as they always do, doing reviews and bibliographies. But the question then arose: what if you were able to ask those specific questions and find the right materials, materials that had been vetted, that speak to your use case, right? So at that point, I thought, okay, let's try an MVP with this, show it to some of our power users, and see what they have to say. And, you know, what can we really get out of this technology? So I was excited, because AI had been around now, after ChatGPT, for some time, and we're all searching for our use cases, our killer app. And this seemed to be at least something worth trying.
So that's when you and I connected, we set out a PDM, and Juanjo came on board and brought what I like to call just that technical expertise. What would have taken me six months or a year, if I could have done it at all, to understand the technical development stack, just what needs to happen, came together in six weeks, essentially. And it was incredible. It was very much in line with my first MVP in terms of the growth that happened. But I was just extraordinarily surprised that we were able to deliver this, something that I was very happy to put in front of our power users, in six weeks. So it was very exciting to test out.
Bob:Nice, nice. Yeah, congrats on getting that done, both of you. And that's where I'm going to take a backseat now and have you two dive in a bit, maybe on the architecture, how it works, what were some of the challenges. I'm pretty sure the audience wants to know the nitty-gritty of what went into this. So, you want to hit upon that?
Juanjo:Yeah, why not? Let me just give a brief introduction, and Tim can go a little bit deeper into the details, because at the end of the day it's his baby. As Tim said, with RAG you have a knowledge database and you want to ask questions about that database. But you don't ask the way you'd query a normal database, where you do a select, update, whatever; you use an LLM as a middleman. So you ask the LLM a specific question, and the LLM, instead of going through its internal knowledge or training, will try to fetch data from a database, a vector database (the data needs to be vectorized so the LLM can integrate with it), and will reply based on that. So it's different from retraining or training an LLM, which is more costly; you leverage existing LLMs and plug in your own knowledge database. So this is how it works, and Tim can now give more details on how it looks specifically for Quiet Links, and the bits and pieces. So, Tim, you want to jump in?
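[Editor's note] The flow Juanjo describes (retrieve from a knowledge base, then hand the hits to an LLM as context) can be sketched in a few lines of Python. This is a toy illustration, not the Quiet Links code: the word-overlap ranking is a stand-in for the real vector search, and all names and sample texts here are made up.

```python
import re

def retrieve(question: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Rank stored chunks by word overlap with the question (vector-search stand-in)."""
    q_words = set(re.findall(r"\w+", question.lower()))
    scored = sorted(
        knowledge_base,
        key=lambda chunk: len(q_words & set(re.findall(r"\w+", chunk.lower()))),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Augment the user question with retrieved context before the LLM call."""
    context = "\n".join(f"- {c}" for c in chunks)
    return (
        "Answer using ONLY the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

kb = [
    "Noise pollution ranks second among environmental health risks.",
    "Open-plan offices reduce focus under intermittent noise.",
    "Quiet environments improve sleep quality.",
]
question = "Does quiet improve sleep quality?"
print(build_prompt(question, retrieve(question, kb)))
```

The key point of the pattern survives even in this toy: the LLM is never asked to answer from its training data; it only ever sees the question plus whatever the retrieval step pulled from the curated knowledge base.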
Tim:Absolutely, thanks Juanjo. And just to give Juanjo a huge tip of the hat: there's a lot that goes into this, and I think the critical piece for my learning and for the eventual deployment of this app was his deep knowledge of how this pipeline actually works. And I'll share a little bit about that now. But without that, this doesn't happen. And the most surprising thing, the most surprising thing to me of this whole project, was that it's not a huge technical challenge in Python to do what we're about to talk about. It's actually pretty straightforward. It's just knowing what that pipeline is that's the real enabler here. And so I was surprised that this was within my reach, of course with Juanjo's help, but it was extraordinarily encouraging. And I want to encourage folks out there to look at this particular piece, because even if you consider yourself a beginner or an advanced beginner, as I do myself, it's completely within reach. So there are, I'd say, three buckets to how this works and where Python shines, especially for this kind of project. First of all, you have your collection, what Juanjo was calling your knowledge base. In this case, it was PDF documents that are open-access research articles. So the first step is to grab all those. Python has a library called Docling, which was a wonderful way, pretty much out of the box, to read all the PDFs, get them chunked, and get them ready to go into the second step, which is this vector database. And for me, the beautiful learning was, I knew I'd heard of vector databases. We probably all hear these buzzwords: embeddings, all these things. But this next process essentially takes those documents and creates embeddings for them. And for those who don't know (it took me some YouTubing and some conversations),
it's essentially like a coordinate system: assigning numbers, coordinates, to the text that's in there, so you can do these proximity things. I'm not a mathematician, I'm a humanities major, so I can't chart these things and say more, but these get assigned to all the parts of the documents that are chunked and then put into this vector database. And the OpenAI model, or whatever model you're using, will help with that process, essentially; it will get those embeddings for you. So you put it all in a vector database to get to that third stage, where it's searchable, in that typical Google way, so that now you can do that last process of calling that AI system, whatever it is (Grok, OpenAI, whatever), sending that query, and allowing it to ping against the database to get your response. And, in essence, again, to emphasize: the technical stack is a flow, and once you've got that, it's not that difficult in Python. It's pretty straightforward. The libraries are there for you. These big LLMs have done the heavy lifting, so you can just use their APIs to do this work. So it was really exciting. And it was also interesting, and I think we'll probably talk about this, how some of the solutions to get it working right weren't necessarily coding solutions. It was about the system prompt, and how you, through trial and error, the same way you would code or solve any problem, shape that to get exactly the kind of response that you want. Because in this case, what's critical is that the responses only include things that are in that knowledge base, and that they're able to cite and quote those, so you can see this is actually a source of truth, because for our user base, that's their make-or-break point. Don't give me things that are made up. You know, the word we like to use is hallucinate, but how about just: take a left turn?
You know, we don't want creativity. Poetry is great, but we don't want haikus here that are not about the actual research, right? And so there's nuance in there too of how that worked. But in essence, that's the pipeline, and it's completely in reach with things like Docling, IBM's library in Python, all these things. It's right there for you.
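[Editor's note] The "coordinate system" and "proximity" intuition Tim describes is cosine similarity over embedding vectors. A minimal sketch, with made-up 3-dimensional vectors standing in for real model embeddings (production embeddings have hundreds or thousands of dimensions and come from an embedding model's API; the titles below are hypothetical):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """How close two vectors point in the same direction: 1.0 = identical."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: imagine the axes roughly mean (noise, health, sleep).
index = {
    "cardiovascular effects of traffic noise": [0.9, 0.8, 0.1],
    "noise and sleep disturbance":             [0.7, 0.3, 0.9],
    "urban planning guidelines":               [0.4, 0.1, 0.2],
}
query_vec = [0.6, 0.2, 0.95]  # an embedded question about noise and sleep

# A vector database does exactly this comparison, just at scale and fast.
best = max(index, key=lambda title: cosine_similarity(index[title], query_vec))
print(best)
```

This is the whole trick behind "semantic" search: nearby vectors mean related text, so the query's nearest neighbors are the chunks worth handing to the LLM.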
Juanjo:Yeah, that's right. And I remember the problem with the hallucinations: it was to be expected, and it was a quick fix once we started playing around with the prompt. Which is another thing we need to take into consideration: we are dealing with LLMs, and prompting is not to be disregarded. It's something we need to look into very carefully and treat as an important thing, which it turned out it was, because we were focused on the vector database, Weaviate and Docling, and how we chunk it, and embeddings and the embedding model, all these small decisions along the way that can make an impact on the final solution. But once you have all that settled and working okay, the last piece, the piece that is the orchestrator of everything, needs to be aware of what needs to be done. So we need to train that. It's like a librarian that goes looking for a book across the street. No: look for books in this bookstore, don't go to another one. That was the major hurdle. And the deployment, which I'd also like to talk about, I have to admit. Bob can jump right back in and give us some highlights, because he's the Heroku ninja. But why on earth did you decide to do this on Heroku?
Bob:I apologize in advance. Okay, I'm not affiliated or anything.
Tim:Yeah, yeah. Yeah, yeah. Well, I I want to jump in and say I love what you said about orchestration, Ju, because it just makes me think of how uh the system prompt is like a conductor conducting this whole thing. And our, you know, as coders, our you know, reflex is to go right to the code, like you were saying, you know, all those little bits and try and move those levers. But this is a case where you just have to take a pause, take a step back when the technical stack is working, and not look for that solution in your API call or your the way you're vectorizing things, or the way it's actually something else. And that was a huge revelation for me. Um yeah, yeah, I think it's it's a really good thing to highlight.
Bob:Yeah, sure. Yeah, I also really like the whole teamwork here, right? Because you have Juanjo here, he's deep into the AI and the RAG, and Tim, you piecing it all together, learning as a student, right? And, you know, as a developer. And I did a bit of front end as well, and then it all came together, right? Like, when you integrated it into the search box and you saw those RAG results coming up, what was that like?
Tim:Oh, wow, I couldn't believe it. It was one of those aha moments, because I knew I was rash in thinking this project through. And, you know, Juanjo really onboarded me with great documentation, using AI, actually, to get a wiki going. But even with that onboarding, all I could see in the initial part was each place where things might break down, where it might not work. But when it came back, I was like, wow, I did this. Like, this actually works, oh my gosh. You know, I'm seeing these prompt systems everywhere, and now there's one on this website, and it's returning something that's complete. Even the first results were encouraging: very much on topic, very much what I had hoped for, at a high level. And I was just amazed at the technology. Yeah.
Juanjo:Yeah, and I also want to highlight your work, Tim, because you did a great job trying to understand everything. I remember I gave you a mini RAG project, a small project, and you dissected it and asked me questions about it, which is good. I think that is the way to go nowadays with these AI systems: try to understand what you're dealing with, because with that in mind, you're able to implement it into your own application. It was a standalone application, and you integrated it with yours, and after that you improved it, which is amazing. I also want to highlight that we are just the conductor, we are your system prompt, but the user prompt is yours. So you did the user prompt, and you did it very well.
Bob:I'm very proud of you both. Awesome. And have you already gotten any feedback in production from users?
Tim:We have, yeah. So, like I mentioned, there are a number of power users that took a look at this, and they were very excited, I think, right off the bat. Because what this solves for them goes back to the original problem: there are researchers and research information all over the world, no one really knows what's going on, and the right hand doesn't talk to the left. And this was a case where what these folks really want is, first of all, a place that they can understand as a source of truth, and second, somewhere they can find what they need. And this gets them much closer to both those things. It was the first time in a while I had heard a lot of new questions that I hadn't had before. So it felt like I got deeper into their research process and the kinds of things they're looking for. And they started to brainstorm with me about things that are not technical solutions now: the kind of content that gets integrated in there, right? Because a lot of these articles are behind paywalls or are not accessible, or what should go in the knowledge base, all these things. It became a real hotbed of activity. So I think, overall, from a technical standpoint, it was a huge success, because they weren't talking to me about UI issues or things they saw that were irrelevant or problems with the system. They were just saying, well, here are some ideas to supplement this and give it a little bit more. So it was a big win. The front end, Bob, that you helped us with, the back end that Juanjo was orchestrating, the things we all talked about, really did come together in six weeks.
Julian:It still blows my mind. Just a quick break. Let me ask you a question. How much of your last pull request did you actually write? And how much did AI write? If Copilot or ChatGPT disappeared tomorrow, would you still know how your code works, and could you explain it in a code review? This is the problem we hear about the most from developers like you who reach out to us for a chat. At Pybites, our Pybites Developer Mindset program helps you become the developer who uses AI effectively, not the one who is completely propped up by it. Through one-to-one coaching, real-world projects, and proper code reviews with an expert coach, not AI, you'll actually understand the Python code that you ship. If you're tired of feeling like a prompt engineer instead of a real developer, check out and apply for PDM using the link in the description. Now back to the episode.
Tim:And I wonder, Juanjo, in all of this, now in hindsight, I've been wanting to ask: was there anything that was surprising to you? Because you've seen these systems before, right? And I know you've seen them succeed, and maybe not get quite where they wanted to. Whether it was a happy surprise or something you thought would be a challenge that wasn't there, was there anything that popped up that was new or surprising?
Juanjo:For me, the thing I was most afraid of was the integration with your existing app, so we wouldn't break the pre-existing application. I knew it was going to work, because these things are proven. And you did a great job with the metadata, extracting all the links and all the documentation, and Docling is a foolproof library; I like it. There are some others which are also good. And for the vector database we chose Weaviate. It's not the easiest one to use, but I think it's one of the most powerful. We could have used ChromaDB, or even Postgres with pgvector. So for me, the question was how my mental model of RAG would fit into your broader application, and it was a nice surprise that it worked. And we reduced the time to get the query back and results, which is amazing. I just tested it a couple of minutes ago, and it's working and giving a good result. And you click on the link and it goes to a real document, not a fake one, which is something I'm very happy to see working that way, yeah.
Bob:How is it even working on Heroku? Because Heroku is shared hosting, right? So you don't really get that much power. So is a lot of this offloaded to Weaviate as a service, or how does that work?
Juanjo:Yeah, that's one of the two things we had to sort out, because, as we say, it works on my machine, and it worked on Tim's machine and on my machine, but once you want to deploy it, things start to break apart. And Heroku has a limitation on the slug, the size of the image it can store. I don't know if you can overcome that with more money, but within our constraints, we decided to remove some of the heavy loading that we had. For example, we'd already uploaded the documents. Uploading is not a concern of the Quiet Links application; it's something that can be done offline, completely isolated from the app. So we decided to remove it from the final image, and finally it fit. And as you said, Weaviate takes care of everything: you send the query to Weaviate, it will search and do its magic, and it will send back to the LLM the links and the information the LLM needs to build the final answer. So, in a nutshell, yeah, we are relying on Weaviate to offload the heavy lifting from Heroku.
Tim:Yeah, it was a great moment in the app too, because, like you're saying, Juanjo, we had an existing app that we didn't want to break, and that was running on Heroku. So there was this moment where we started to think, oh my gosh, are we going to have to containerize this now and do this whole other thing, which is out of my skill set? I've always used Heroku. I'm a very light developer. But the brilliance of Juanjo really came through. It was just like: oh, we don't need these things that are taking up all this space in the slug for the actual app. We just use them to index. We don't need them, so what if we turn those off? And then it was equally as exciting as the moment the first prompt came through. It was like, okay, it's live and it works on the system that we have. And we didn't have to go through this whole piece. Which, taking a step back, is one of the things I learned that really struck me as a developer, and that I'm taking into my larger work: what I like to call the pause moment. The moment where you step back from that dopamine hit you get from getting really granular, from being on that wheel of trying to solve a problem and looking only at the things you think are the root of it, and take a deep breath and just be present to what is going on, allowing some other things to come in that might not be on your to-do list or in your understanding of the problem. And we kind of did that collectively, and I think that's where Juanjo, offline, then said: hey, instead of doing it this way, what if we just turn them off?
And it was something that we hadn't even considered, and that happened a couple of times in the project, where I had to get out of the code, out of the system I thought was really controlling things, and then the solution would kind of present itself. So it takes some doing and some courage, right? Because we love being in there. Part of the joy of Python is you get things to work, and you can kind of get them to work fast. But sometimes it's not about that. And I think that really was probably one of the most valuable lessons for me in this whole process.
Bob:Beautiful. Yeah, because we can go so fast with Python, right? But then we might also just go faster in the wrong direction, right? So with these kinds of projects, the design is the most important thing, right? And that absolutely takes pause and reflection.
unknown:Yeah.
Bob:Great tip. Yeah. Nice. Anything else you want to highlight about the project and the learning process?
Tim:I think mostly I'm just so excited to see that we have a use case that has gotten a lot of uptake. I think that's one of the biggest challenges of trying to learn this technology: what the heck is going to be useful? Because we can stand things up pretty quickly, especially with Python and AI. But it's really exciting to see something that is getting some traction. And yeah, I'm still shocked, and I just want to emphasize how within reach this was with this team. It's just been extraordinary and very energizing.
Bob:Yeah, happy to hear it. And is there any ongoing maintenance? I guess that was something I was wondering with the whole RAG solution. You've done the indexing and you're using a vector database; is that something you need to refresh over time, or how does that work?
Tim:Yeah, the only thing that needs tending to is adding new articles. They're being published all the time. But the pipeline handles everything else, the database, everything. And Juanjo, you can share more, for sure. But that's the only piece, and that's really as we like it. You know, we keep an eye on the publications that are coming out.
Juanjo:Yeah, I think the only thing is to incorporate more knowledge into the database and see how it scales. I think Weaviate scales very well with huge amounts of data, and if not, we will eventually have to look for a different solution. But from now on, it's as Tim said: keep ingesting documents and new articles, and keep an eye on the latest research, so people using the app can get the most out of it. Yeah.
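[Editor's note] That recurring maintenance step, ingesting new articles without re-processing old ones, can be sketched with a content hash as a dedupe key. This is a hypothetical illustration, not the Quiet Links code: the real pipeline would do the Docling chunk-embed-upsert work where the comment marks it, and may key documents differently.

```python
import hashlib

def ingest_new(documents: dict[str, bytes], seen_hashes: set[str]) -> list[str]:
    """Return the titles actually ingested, skipping already-seen content."""
    ingested = []
    for title, pdf_bytes in documents.items():
        digest = hashlib.sha256(pdf_bytes).hexdigest()
        if digest in seen_hashes:
            continue  # already in the vector database; nothing to do
        seen_hashes.add(digest)
        ingested.append(title)  # real pipeline: chunk, embed, upsert here
    return ingested

seen: set[str] = set()
batch1 = ingest_new({"comedy-club-noise.pdf": b"%PDF-1"}, seen)
batch2 = ingest_new(
    {"comedy-club-noise.pdf": b"%PDF-1", "hospital-noise.pdf": b"%PDF-2"},
    seen,
)
print(batch1, batch2)
```

Running ingestion as an offline script like this, rather than inside the web app, is also what kept the heavy dependencies out of the deployed Heroku slug, as discussed earlier in the episode.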
Tim:And I would say, try it out too. Go to Quiet Links. The fun part is, I mean, there are all kinds of topics on noise. There was just an article published on comedy clubs, noise in comedy clubs, and how people don't find things as funny if there are certain kinds of noise distractions. So you could find pretty much any topic in there that you like. But the comedy one really grabbed me. That was just published at the end of last year.
Juanjo:Yeah, it's good, it's good, and it's live. This RAG is what keeps the app alive. And as you say, you have the pipeline to ingest into the database, and you just need that to keep it up to date. The rest is taken care of by Weaviate and the Heroku part. So yeah.
Bob:Nice. So what's next for Quiet Links? Because I guess you're already thinking bigger, right?
Tim:Yeah, I mean, now that the infrastructure's in place, the next step really is talking to publishers and getting content in there. It's a really interesting moment in publishing, which makes so much sense: authors and so forth just want to understand, if their work is going into a model, should they be compensated? How are they compensated? So we're thinking through some models that we can come to publishers with, to ingest the content in various ways, link out to it and so forth, in a way that protects the authors and also gives the work greater visibility through the app. So the next step is: we've got that technical structure in there, so now it's this leading-edge piece of how AI systems work, especially something that's publicly available like this. How do we bridge those gaps between the publishers and the information seekers? It's going to be exciting. These conversations are going to be, dare I say, leading edge, getting all of us to think about those models. And as we're able to bring them in, I think it's only going to take off from there.
Bob: Nice. Well, good luck with that, and we'd love to hear how that's going six months from now. I'm going to go back to the MVP question from four years ago. I'd need to listen back, but I think I asked something like: what would you recommend for people who have a passion like you do and want to start their MVP? Back then it was pre-AI in a sense, and now AI is everywhere, right? So would you answer the question differently? I don't know exactly what you said then, so maybe I'll just ask it again: for people wanting to build a business around their passion, or something they care about, what would you recommend? And is AI playing a role in that generally, or was that just specifically for you?
Tim: Yeah, I would say: if your passion maps onto a need that you know at least one other person has, go for it. Because at the end of the day, it may not look like you thought it would, but it's going to be an exciting process to discover exactly what that map is between your passion, the needs of others, and how you can serve them. And it's such a driver for learning. It really is, because you get to enjoy your topic and also bring others into it. So it starts to become a community, and that really feeds itself. So try not to think about what you would like others to do or be passionate about. Try to think more about where your passions and interests overlap with the needs of others. That's just a great catalyst for growth. And iteratively, you'll discover exactly what it looks like in the end, what the potted plant looks like after you've put that seed in the soil, right? It's going to grow in all directions. Sometimes it needs to be trimmed, sometimes it needs a little more water, but it's been a great source of inspiration for me.
Bob: Well, don't leave it in the desert.
Tim: Unless it's a cactus. Yeah.
Bob: Nice.
Tim: And Juanjo, what was it like from your side? What was it like on the mentor side to watch someone who has something that's very near and dear to them, with an idea for a project, go through this process?
Juanjo: I think it was very rewarding, because working with you was really a pleasure. You put in the effort, you wanted to understand everything down to the very fine detail. And for me it was very easy, because I had already built some RAG applications, even similar ones to yours, though not at that scale, just for myself or other people. For me it was this process of seeing how you came in with one project and transformed it into something bigger, something broader, something with more capabilities and features, while staying faithful to yourself, which is the most important thing. You did your application. I gave you the tools, the documentation, we had our conversations, but at the end of the day you decided what to do, and that was amazing. And the living proof is your application. So again, congratulations, and everybody, test it out, please.
Bob: We'll link it, for sure. Nice. Maybe as the last question: you two had a really great mentoring relationship, but of course this was not an easy solution, right? When you inevitably got stuck, when there were challenges, how did you best cope with that?
Juanjo: Well, yeah. Oh, go ahead. No, from my side it was trying to live up to the standards of the client, because at the end of the day I considered you a friend, Tim, but we were in a client-and-mentor relationship. And for me it was very important to be useful to you, to deliver what you needed, and to try not to build my app, but to guide you so you could build yours. For me, that was the most challenging part. And as you say, we got stuck. But when you take some distance from the problem, sometimes the light goes on, and you say, okay, why don't we try this, and we try that, and another day you sort it out. But I think you need to trust each other, which we did, and that's why everything went smoothly. The result is there, so it speaks for itself.
Tim: Yeah, and from my side, it was just knowing that someone like Juanjo was there. My responsibilities were to always do my homework, to put the time in, and to come with a fully formed question or issue if one came up, having really tried first. I was asking to learn something, not just because I hit something and got frustrated. I would really try to go the extra mile. And two, I tried to be thoughtful about, how do I say it, pointing Juanjo in the right direction for his expertise, right? Because he's such a treasure, there's so much experience there. But energy-wise, we both have limits. So what's going to be the best use of our time? Do I ask him a syntax question about a loop I'm working on? Maybe. Or is it the architecture question I'm trying to get my hands around, because I don't know it, that's really the best use of our time? I tried to be considerate of that, because Juanjo always came back with solutions and good thinking. I knew that for this project to work, and for me to learn, I wanted to get into the gray areas, the places I didn't know, and the way to do that was to stretch myself and try to get there. So the most exciting thing was to come back to my mentor with these big questions, or more pointed questions, and that really drove the project. And I saved some of the nitty-gritty stuff for myself: the syntax, how to set up this function, should I use a class, should I do this? Those kinds of questions I felt comfortable exploring on my own. Then in a code review something might come up, but I wouldn't divert us into those areas.
Bob: Yeah, and that's a great point. I think people working with us in a coaching capacity get the most value out of these design and architectural questions and that thinking process. Because especially now with AI, syntax is cheap, right? You can look up a lot of Python things, like the basic syntax questions. Maybe less so for when to use decorators or context managers, when they make sense, but even that, to a certain extent, you can find pretty easily. But if you go to Juanjo with a question about the deep workings of RAG, where he has spent a lot of time and has a lot of experience, then you get very insightful feedback, right? And I think that's what people get the most out of. Yeah, definitely.
Juanjo: It's like that. As you say, you can ask LLMs for syntax and those types of things, but sometimes how to structure or architect the project is bigger than we think, and it's better to go to someone who has done it in the past and has, let's say, banged his head against the wall before you. That's the added value of having a coach with you. Yeah, for sure.
Bob: I like that saying: you often don't know what you don't know, right? And then by building this stuff you start to see that, and that's where the growth happens, I think.
Juanjo: Yeah, yeah.
Bob: Well, thanks guys, and congratulations again, this is awesome. Of course, from the sidelines I was very excited to see it, and very happy. Do we wrap up with some books, maybe, or do you not have anything in particular?
Juanjo: Well, I'm reading a book on MCP security. It's in Spanish, by a Spanish author who is very knowledgeable in cybersecurity. You know, AI is pretty cool, but you also have to watch out for security, for which data you let it leak. So it's a very interesting read. It goes deep into the architecture of MCP servers and how you should orchestrate them from an architectural point of view. That's the book I'm reading right now. It's very pleasant and entertaining, technical as usual, but yeah, it is what it is.
Bob: Well, yeah... no, you go ahead, because I'm still thinking of something.
Tim: I see your bookshelf behind you, so I know there's a lot going on in your book world. I'll pivot and say, besides a book, one thing that's really exciting is that I am now mentoring three people in Python at my day job, which is very exciting, based on everything we've learned here. Folks who are not developers, who are very eager to learn Python and come online. They've never touched Linux, they've never touched Git, they've never done any of this stuff. So our "books", air quotes, are essentially going to be lots of Google searching, and developing that muscle. I think that balance between what Google gives, what AI gives, and what a mentor gives, if we consider all of those the books, like a corpus, is going to be really fun to see from a mentoring perspective. It takes me back, Bob, to when you and I were first starting, and I was learning basic things like git add . And you were encouraging me: hey, it's out there, go check it out, develop that muscle. So I would say that's a different kind of book I'm working on right now, but it definitely feels full circle.
Bob: Nice. Yeah, and awesome that you're mentoring, right? That's giving back, and you're on the teaching side again, so you'll see the gaps, right? You get a better understanding, even of the basics. So really nice. I still haven't found my title; I might need to put a RAG on it, because I still haven't found it. But I'm reading a lot of Rust, because I'm going through the Rust cohort myself, which is now in week three, with Jim Hodapp. I'm reading along for the second time there, and I'm reading a lot of Rust for the exercises, rebuilding them as well for the relaunch of the platform. So that counts a little bit towards the reading, but I hope to get back to a novel. It's pretty busy right now. Well, thanks guys, it was a pleasure having you on, and thanks for sharing. Very exciting, and I hope the audience got a lot of inspiration from this. I'm sure they have. Maybe a final shout out to wrap it up?
Tim: Well, big shout out to Juanjo. Thank you for being an amazing mentor, and Bob for shepherding us through. It's just great to be back with PDM and Pybites, very exciting, and let's launch into 2026.
Juanjo: Yeah, thank you, Tim. It's been a pleasure. And everybody out there: go ahead and start writing your Python code. And if you want a good place to learn, go check out Pybites. Why not? It's the best place.
Bob: Great, yeah. And if you want to share your progress and you have cool app ideas, come join our community and tell us about them there. We'd love to hear about them and help you.
unknown: Sure.
Bob: Cool. Well, thanks, and have a great day. We'll talk in the community. Thank you. Thank you very much. Thanks, Bob. You're welcome.
Julian: Hey everyone, thanks for tuning into the Pybites Podcast. I really hope you enjoyed it. A quick message from me and Bob before you go: to get the most out of your experience with Pybites, including learning more Python, engaging with other developers, learning about our guests, discussing these podcast episodes, and much, much more, please join our community at pybites.circle.so. The link is on the screen if you're watching this on YouTube, and it's in the show notes for everyone else. When you join, make sure you introduce yourself and engage with myself, Bob, and the many other developers in the community. It's one of the greatest things you can do to expand your knowledge, reach, and network as a Python developer. We'll see you in the next episode, and we will see you in the community.