Count David Cramer among the skeptics about the current ability of generative AI (GenAI) to do the jobs currently being done by human developers and engineers.

GenAI tools are not yet reliable enough to write much software that can be used in production, he said in this episode of The New Stack Agents livestream. But Cramer, founder and chief product officer of Sentry, a maker of application monitoring software, is also willing to mute that skepticism enough to go to market with AI-based products.

In June, Sentry’s Seer, which uses GenAI capabilities for debugging and identifying the root causes of bugs, went into general availability. The company arrived at the decision to start exploring what large language models (LLMs) could do for its customers by thinking about its core mission.

“Sentry is really good at helping you debug whatever’s wrong with your software in production,” Cramer told Alex Williams, TNS founder and publisher, and Frederic Lardinois, TNS senior editor for AI, co-hosts of this episode. “Because we capture all this information, can [we] make it better to debug things? Could we make it so we can actually just go beyond debugging things?”

AI’s Potential To Mimic Senior Engineering Expertise

By leveraging LLMs to enhance Sentry’s solution, a user could benefit from LLMs’ large-scale pattern matching, Cramer said. “We found that the technology was super good for our business. One, a core part of Sentry is aggregation. I have an error. I have another error. Are they the same error?”

And also, because of the traces Sentry collects, the LLMs “were actually able to summarize a lot of data in the system, and it’s able to phenomenally [identify] root cause problems, like in a way that you would do as a human.”

He offered an analogy, pointing to friends he’s collected over the years who are staff or principal engineers in large companies. “They don’t seem to do a lot. They get paid enormous amounts of money, and they generally hate their jobs,” he noted.
“For a long time, I didn’t understand why, why they exist, like, why companies would value them so highly.

“And then I realized what it is: For true seniority in engineering, it’s about being able to theory-craft how systems work. And the only way you can do that is if you have domain experience in those systems. And at a lot of these big companies, the domain experience is really just that company. It’s very specific to that company.”

He used this analogy, he said, to highlight an area where AI may ultimately prove beneficial: the ability to summarize what’s going on in a system very quickly. “You can actually do a lot of that with LLMs now.”

Why AI Still Requires a ‘Human in the Loop’

Which isn’t to say, however, that AI is ready to replace flesh-and-blood engineers, Cramer acknowledged. Again, that skepticism.

“We are very much at the phase, and I think we’re gonna be at this phase for quite a long time, where it is truly human in the loop, nonstop,” he said. “But it’s a great enabler for moving faster on some things, like code review.”

“There’s dozens of companies, including us, that have an investment in that space, and it doesn’t actually have to be correct, because there’s still a human there. If it can find some bugs that maybe you didn’t have to find, or you didn’t find yourself, that’s a net win — net positive in productivity.”

The Current State of AI-Powered Patch Generation

The innovations that leverage AI technology will, as innovations often do, improve incrementally, Cramer suggested. So far, he said, the product results of Sentry’s experiments with AI “are not designed for full autonomy. There are some cool things that we can do autonomously.”

For instance, while AI-fueled observability can usually identify the root cause of a bug, “the patch generation is awful. I will be the first to tell you that it is not that good, and everybody’s patch generation is awful. And that’s fine.
That’s the state of the technology.”

But, he added, “what we can do today is what we can do today. OK, maybe we can much more effectively get you to that [root] cause, so you can debug faster.”

Sentry’s Shift to Fair Source Licensing

Cramer also weighed in on the ongoing debate companies have about keeping their innovations open source. Sentry, he noted, is 15 years old and long ago moved away from open source licensing to fair source.

In 2023, the company relicensed Sentry under the Functional Source License (FSL), which converts to Apache 2.0 after two years. “Actually, the majority of our software is Apache license at this point,” Cramer said.

“The open source thing is not necessarily the right thing to stick into a business model,” Cramer said. “It just complicates everything. It commoditizes the market.”

The decision came about, he said, because “I was getting plagued with people trying to take Sentry and commercialize it that were not part of our organization, and were not part of the open source community there. To be honest with you, as other just commercial entities, they were trying to monetize it.”

It made him mad, he said. “I’m like, ‘Look, we’re just going to stop them.’ We’re going to change the license to something that we inherently believe achieves the goals we have. We want everybody to use our software. We don’t want people getting a free ride, especially other venture-backed companies, to just take our software and commercialize it.”

The goal, he said, was to avoid a situation with multiple versions of Sentry’s product: the “open source, free version and then the really good version that everybody actually wants, that costs, like, an arm and a leg or whatever.
And so we came up with this thing called fair source, which, TL;DR, is a license that has very few strings attached.”

Check out the full episode of The New Stack Agents for more from Cramer about fair source licensing, how Sentry is approaching AI with both skepticism and curiosity, the new tech jobs that AI might create and more.

The post Sentry Founder: AI Patch Generation Is ‘Awful’ Right Now appeared first on The New Stack.