An AI Pseudo-Experiment: Blocking All The AI Tools I Use for a Day


Hello beautiful people! Welcome to the Programming and Doodles blog! In this article, I'm going to present the results of a pseudo-experiment: blocking all the AI tools I use while still working through a normal day (as a developer and a technical writer).

Subscribe to receive these straight into your inbox: https://codedoodles.substack.com/

Ever since ChatGPT and the other AI chatbots and tools were released, we've gradually lost our grip on our own work. Using them to write and debug code, emails, and even short messages probably got us here. Worse yet, I keep seeing posts claiming that research shows using AI decreases our critical thinking ability. That concerned me: do I still have the skills I had back in 2020-22, before ChatGPT was released?

I can't disagree: AI tools save a lot of time and boost productivity. But at what cost? If we're slowly losing the ability to think and solve problems ourselves, we're ruining one of the greatest gifts to humanity.

That said, don't misunderstand: I don't hate AI. I think it's an amazing concept brought to life by programmers, and all of us use it regularly. Even now, my Grammarly extension is showing me ways to improve these sentences using AI. The only thing I dislike is how much we over-rely on these tools. Shoot, I should write another article about that!

About the experiment

Back to the point: I was a bit disappointed by how much I use AI to do my work, so I decided to block every AI tool for a day. Luckily, I hadn't installed any of these apps on my MacBook, so all I needed was an extension to block ChatGPT, Claude, Gemini, and DeepSeek.

This article was written alongside the work itself, and in the editing phase I realized it looks a bit… weird. It's the "thinking" process of my silly human brain, like ChatGPT's Reasoning. I hereby ask you not to judge me by it.

Date: Feb 17th, 2025

The to-do list:
- Write a script to scrape Bing Search results (for a tutorial on building a scraper).
- Review PR #6 on GitHub.
- Improve the SEO of my personal site, chenuli.codedoodles.xyz (current SEO score: 85).
- If there's time left, design a merch product.

(Such a small to-do list, yes; I'm my own boss.)

Writing a Script

A web scraper in Python is one of the easiest programs you can write, and it's often overlooked. I've written about 3-4 articles on scraping various platforms, like TikTok with Apify, but all it took was a prompt or two on ChatGPT to write the script:

```
Write a python script using selenium and chromedriver to scrape the "Trending Videos" page on TikTok. Structure: the page consists of a card-like structure of trending videos, inside a container with the class tiktok-559e6k-DivItemContainer e1aajktk28 blah blah and blah
```

But this time it's different. I'd write the base of the script first, test it, and optimize it as needed.

I spent a few minutes deciding which library to use: Playwright, Selenium, or Beautiful Soup. Beautiful Soup seemed like the easiest and simplest option, so I went with it.

```python
import requests
from bs4 import BeautifulSoup
```
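By "the base", I mean something this small: just enough to fetch a page and prove the parsing plumbing works. A purely illustrative sketch, not the final code; the browser headers and the real selectors come next.

```python
# Illustrative "base" only: fetch a page, parse it, confirm the plumbing works.
import requests
from bs4 import BeautifulSoup

response = requests.get("https://www.bing.com/search?q=test")
soup = BeautifulSoup(response.text, "html.parser")
print(soup.title.text if soup.title else "No <title> found")
```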
Next, I needed a header mimicking a real browser request, so the script wouldn't get blocked by bot protection (or a CAPTCHA). Writing one accurately from memory is pretty much impossible, so I opened ChatGPT involuntarily. Scary, yes, but it was blocked, for the best.

After a while, I found a sample request header through Google. What a lifesaver.

```python
headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"
}
```

Then I should probably build a search link like the ones Bing uses. Must be similar to Google's, I guess.

I should also identify the class names (is that what you call them?) so I can scrape the elements individually.

Bing does a neat job here. The results are in a list structure, with each list item being a single search result. That makes things a lot easier: I can scrape all these items from the page, assign them to one variable, and then format the headings and links with a loop. Neat! Not a genius method, but at least I haven't completely lost my ability to think.

```python
completeData = soup.find_all("li", {"class": "b_algo"})
```

Oh shoot, that's dumb! It adds an unnecessary intermediate variable and isn't reusable either. We can just loop over the items directly. And let's add some error handling too.

```python
# (this lives inside the scraping function)
if response.status_code != 200:
    print("Failed to retrieve search results")
    return []

results = []
for item in soup.find_all("li", class_="b_algo"):
    title = item.find("h2")
    link = title.find("a")["href"] if title else ""
    results.append((title.text if title else "No title", link))
```

That's much better. Finally, we can add another loop to format and print the results.

```python
search_results = scrape_bing("Programming and Doodles")
for title, link in search_results:
    print(f"{title}: {link}")
```

Two small syntax errors later, it works!

That felt awesome. Just like the good old days.
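For reference, those fragments assemble into roughly this shape. Treat it as a sketch rather than the exact script I shipped; details like the URL handling may differ from what I actually ran.

```python
import requests
from bs4 import BeautifulSoup


def scrape_bing(query):
    # Pretend to be a regular browser so Bing doesn't block the request.
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                      "AppleWebKit/537.36 (KHTML, like Gecko) "
                      "Chrome/111.0.0.0 Safari/537.36"
    }
    response = requests.get(
        "https://www.bing.com/search", headers=headers, params={"q": query}
    )

    if response.status_code != 200:
        print("Failed to retrieve search results")
        return []

    soup = BeautifulSoup(response.text, "html.parser")
    results = []
    # Each <li class="b_algo"> is one organic search result on Bing.
    for item in soup.find_all("li", class_="b_algo"):
        title = item.find("h2")
        link = title.find("a")["href"] if title else ""
        results.append((title.text if title else "No title", link))
    return results


search_results = scrape_bing("Programming and Doodles")
for title, link in search_results:
    print(f"{title}: {link}")
```

Passing the query through `params` also lets requests handle the URL encoding, which is one less thing to get wrong by hand.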
Reviewing a PR on GitHub

This shouldn't take long; it's just a long Python script.

Looking at the PR, the code doesn't interfere with the existing functionality. However, the elements aren't centered like in the original version I created. But this is my second time reviewing this PR and the contributor seems new; asking for changes again would feel bad, so I'd rather fix it myself and leave a good review.

Inspecting the code, he has done a good job. But for some reason, he must've set a sticky option on the GUI elements. Let me remove that.

Hmm, still not working. There should be something in tkinter that supports centering elements.

I vaguely remember the relx/rely options, but it's tiring to add them to every element, and most of the results on Google show the same approach.

Oh, found one! We can just use grid_columnconfigure for it. Thank you, bitRAKE on Stack Overflow.

Phew, I can survive well without AI as a developer in 2025.

Improving SEO

I recently built a personal website (not a portfolio), but its SEO score is a bit awful.

My go-to way to improve SEO is structured data (schema markup) for rich results. If you're unfamiliar, it's a form of metadata that helps search engines understand your content better, leading to enhanced search result features like rich snippets, FAQs, and knowledge panels. If your site's SEO score isn't 100, enabling rich results can often get it there.

This is something I'd normally ask ChatGPT to do: I'd write the content and ask it to produce the proper syntax. It greatly reduces the time taken, but since it's blocked, I figured I'd write it myself.

Or I can copy the schema markup from my main site, codedoodles.xyz, and adapt it. It's my code, anyway.

What's more, I can also add FAQ markup. But I remember reading that Google's policy says you can't add schema markup for content that isn't visible on the website, and obviously my website doesn't have a FAQ section. But that's fine. I can still add a question like "Who is Chenuli Jayasinghe?", with an answer that summarizes the content already on the website. Win-win!

Looks great. The score should go up after deployment.

Hmm, not bad, but it can go higher; the parent website, codedoodles.xyz, got a 100, so this one should too. This is the point where I'd ask ChatGPT or DeepSeek for suggestions, but wait, I have an advantage: I can check codedoodles.xyz's code to see what makes it a 100.

Open Graph? Done.

Twitter card? Done.

Schema markup? Also done.

What else? Let me add some more keywords, then.

Still the same.

Oh yes, I must have missed `alt` descriptions for the images.

Nope, I've added those too. Dang, how stupid can I be? The Lighthouse report itself shows the details.

The robots.txt file is the issue. The sites codedoodles.xyz and chenuli.codedoodles.xyz use the same structure, so I should be able to do another copy-paste job. Again.

Just like the good old days.

Yeah! It's a 100 now.

And that should be it. Designing merch has nothing to do with AI chatbots, and I survived the workday. Time to go read a book, great developer!

Summing Up

My assumption that I'd be a stupid, helpless developer without AI turned out to be false, and I'm happy about it. Although my critical thinking might have been affected (I did make a few dumb mistakes), I can still get things done. The whole process took longer than usual, especially writing that scraping script, but it felt more rewarding. Like finding your old bike in the garage and realizing you haven't forgotten how to ride it.

And sure, I opened ChatGPT involuntarily a few times, and yes, I felt a bit guilty googling and copy-pasting. But hey, that's how we coded before 2022, right? Stack Overflow and documentation were our best friends, and they still work perfectly fine.

The main takeaway is that using AI tools is totally fine and understandable. But once in a while, do what I just did: block those LLMs and try to do the work yourself. You'll feel great, believe me.

And please, don't be like this guy on Reddit. Use your brain.