It’s becoming something of a running gag that any device or object will be made ‘smart’ these days, whether it’s a phone, TV, refrigerator, home thermostat, headphones or glasses. This generally means somehow cramming a computer, display, camera and other components into the unsuspecting device, with the overarching goal of making it more useful to the user without impacting its basic functionality.

Although smartphones and smart TVs have been readily embraced, smart glasses have always been a bit of a tough sell. Part of the problem is of course that most people do not wear glasses at all, whether because their vision needs no correction or because they wear contact lenses instead. This means that the market for smart glasses isn’t immediately obvious. Does it target people who wear glasses anyway, people who wear sunglasses a lot, or will this basically move a smartphone’s functionality to your face?

Smart glasses also raise many privacy concerns, as their cameras and microphones may be recording at any given time, which can be unnerving to people. When Google launched its Google Glass smart glasses, this led to the coining of the term ‘glasshole’ for people who refused to follow perceived proper smart glasses etiquette.

Defining Smart Glasses

Meta’s Ray-Ban Display smart glasses with its wristband. (Credit: Meta)

Most smart glasses are shaped like rather chubby, often thick-rimmed glasses. This is to accommodate the miniaturized computer, battery and generally a bunch of cameras and microphones. Typically some kind of projection system is used to either project a translucent display onto one of the lenses, or in more extreme cases to project the image with a laser directly onto your retina.
The control interface can range from a smartphone app to touch controls, to the new ‘Neural Band’ wristband that’s part of Meta’s collaboration with Ray-Ban, in a package that some might call rather dorky. This particular device crams a 600 x 600 pixel color display into the right lens, along with six microphones and a 12 MP camera, in addition to stereo speakers. Rather than an all-encompassing display or an augmented-reality experience, this is more of a display that you reportedly see floating when you glance somewhat to your right, taking up 20 degrees of the right eyepiece’s field of view.

Perhaps most interesting is the Neural Band, which uses electromyography (EMG) to pick up the electrical signals of the muscles in your wrist and infer the motion that you made with your arm and hand. Purportedly you’ll be able to type this way too, but this feature is currently ‘in beta’.

Slow March Of Progress

Loïc Le Meur showing off the Google Glass Explorer Edition in 2013. (Credit: Loïc Le Meur)

When we compare these Ray-Ban Display smart glasses to 2013’s Google Glass, when the Explorer Edition was made available in limited quantities to the public, it is undeniable that the processor in the Ray-Bans is more powerful and the Flash storage has doubled, but the RAM is the same 2 GB, albeit faster LPDDR4X. The display has a slightly higher resolution and probably slightly better fidelity, but this still has to be tested.

Both have similar touch controls on the right side for basic control, with the new wristband apparently being the major innovation here. This comes with the minor issue of now having to wear another wrist-mounted gadget that requires regular charging. If you already wear a smart watch or similar, then you had better have some space on your other wrist.

One of the things that Google Glass and similar solutions – including Apple’s Vision Pro – have really struggled with is that of practical use cases.
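As an aside, the EMG principle behind such a wristband can be sketched in broad strokes: sample the electrical activity of the wrist muscles, compute a short-window envelope such as the RMS amplitude, and map it to a gesture once it crosses a threshold. The Python below is purely illustrative, with synthetic data; the window size, threshold, and the ‘pinch’ gesture label are made-up values for the sketch, and Meta’s actual signal pipeline is not public.

```python
# Illustrative sketch of EMG-style gesture detection (NOT Meta's pipeline).
# Idea: muscle activity raises the signal's short-window RMS envelope;
# crossing a threshold is treated here as a "pinch" gesture.

import math

WINDOW = 50        # samples per analysis window (assumed value)
THRESHOLD = 0.3    # RMS level that counts as muscle activity (assumed value)

def rms(samples):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def detect_gestures(signal):
    """Split the signal into windows and label each as 'pinch' or 'rest'."""
    labels = []
    for i in range(0, len(signal) - WINDOW + 1, WINDOW):
        window = signal[i:i + WINDOW]
        labels.append("pinch" if rms(window) > THRESHOLD else "rest")
    return labels

# Synthetic signal: 50 near-silent samples, then 50 samples of "activity".
quiet = [0.01 * ((-1) ** n) for n in range(50)]
active = [0.8 * math.sin(n) for n in range(50)]
print(detect_gestures(quiet + active))  # → ['rest', 'pinch']
```

A real implementation would of course deal with noise filtering, per-user calibration and far richer classifiers, but the envelope-plus-threshold idea is the usual starting point for EMG interfaces.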
As cool as it can be to have a little head-mounted display that you can glance at surreptitiously – with nobody else around you able to see the naughty cat pictures or personal emails currently being displayed – this never was a use case that convinced people to buy their own Google Glass device.

In the case of Meta’s smart glasses, they seem to be banking on Meta AI integration, along with real-time captions for conversations in foreign languages. The awkward point here is of course that none of these features requires more than a run-of-the-mill smartphone, which can do even more, with a much larger display.

Ditto for the on-screen map navigation, which overlays a Meta Maps view akin to Google’s and Apple’s solutions to help you find your way. Although this might seem cool, you will still want to whip out your phone to ask a friendly local for directions when said route navigation feature inevitably goes sideways.

Amidst the scrambling for a raison d’être for smart glasses, it seems unlikely that society’s attitude towards ‘glassholes’ has changed either.

Welcome To The Panopticon

Example of a panopticon design in the prison buildings at Presidio Modelo, Isla de la Juventud, Cuba. (Credit: Friman, Wikimedia)

The idea behind the panopticon design, as created by Jeremy Bentham in the 18th century, is that a single person can keep an eye on a large number of individuals, none of whom can be certain whether they are being observed at that very moment. Although Bentham did not intend for it to be used solely in prisons and similar buildings, this is where it found the most uptake.
Inspired by this design, we got more modern takes, such as the telescreens in Orwell’s novel Nineteen Eighty-Four, whose cameras are always on even though you cannot be sure that someone is watching that particular screen.

In today’s era where cameras are basically everywhere – from CCTV cameras on and inside buildings, to doorbells and the personal surveillance devices we call ‘smartphones’ – there are also places where people are less appreciative of having cameras aimed at them. Unlike a smartphone, where it’s rather obvious when someone is recording or taking photos, smart glasses aren’t necessarily that obvious. Although some do light up an LED or such, it’s easy to miss this sign.

Take, for example, the TikTok video by a woman who was distraught to see that the person at the wax salon where she had an appointment was wearing smart glasses. Unless you’re actively looking and listening for the cues emitted by that particular brand of smart glasses, you may not know whether your waxing session is being recorded in glorious full-HD or better for later sharing.

This is a concern that blew up during the years that Google Glass was being pushed by Google, and so far it doesn’t appear that people’s opinions on this have changed at all. This makes it even more awkward when those smart glasses are the only prescription glasses that you have on you at the time: do you still take them off when you enter a place where photography and filming are forbidden?

Dumber Smart Glasses

Although most of the focus in the media and elsewhere is on smart glasses like Google Glass and now Meta/Ray-Ban’s offerings, there are others that fall under this umbrella term as well. Certain auto-darkening sunglasses are called ‘smart glasses’, while others are designed to act more like portable screens for use with a laptop or other computer system. Then there are the augmented- and mixed-reality glasses, which come in a wide variety of forms and shapes.
None of these are the camera-equipped types that we discussed here, of course, and thus they do not carry the same stigma.

Whether Meta will succeed where Google Glass failed remains to be seen. If the criterion is that a ‘smart’ version of a device enhances it, then it’s hard to argue that a smartphone isn’t much more than just a cellular phone. At the same time, the ‘why’ of cramming a screen and computer into a set of dorky glasses remains much harder to answer.

Feel free to sound off in the comments if you have a good use case for smart glasses. Ditto if you would totally purchase or have already purchased a version of the Ray-Ban Display smart glasses. Inquisitive minds would like to know whether this might be Google Glass’ redemption arc.