Broadcast Engineer at BellMedia, Computer history buff, compulsive deprecated, disparate hardware hoarder, R/C, robots, arduino, RF, and everything in between.

History of Closed Captions: The Analog Era


Closed captioning on television and subtitles on DVD, Blu-ray, and streaming media are taken for granted today. But it wasn’t always so. In fact, it was quite a struggle for captioning to become commonplace. Back in the early 2000s, I unexpectedly found myself involved in a variety of closed captioning projects, both designing hardware and consulting with engineering teams at various consumer electronics manufacturers. I may have been the last engineer working with analog captioning as everyone else moved on to digital.

But before digging in, there is a lot of confusing and imprecise language floating around on this topic. Let’s establish some definitions. I often use the word captioning which encompasses both closed captions and subtitles:

Closed Captions: Transmitted in a non-visible manner as textual data. Usually they can be enabled or disabled by the user. In the NTSC system, it’s often referred to as Line 21, since it was transmitted on video line number 21 in the Vertical Blanking Interval (VBI).
Subtitles: Rendered in a graphical format and overlaid onto the video / film. Usually they cannot be turned off. Also called open or hard captions.

The text contained in captions generally falls into one of three categories. Pure dialogue (nothing more) is often the style of captioning you see in subtitles on a DVD or Blu-ray. Ordinary captioning includes the dialogue, but with the addition of occasional cues for music or a non-visible event (a doorbell ringing, for example). Finally, “Subtitles for the Deaf or Hard-of-hearing” (SDH) is a more verbose style that adds even more descriptive information about the program, including the speaker’s name, off-camera events, etc.

Roughly speaking, closed captions are targeting the deaf and hard of hearing audience. Subtitles are targeting an audience who can hear the program but want to view the dialogue for some reason, like understanding a foreign movie or learning a new language.

Titles Before Talkies

Intertitles from the 1920 film The Cabinet of Dr Caligari

Subtitles are as old as movies themselves. Since the first movies didn’t have sound, they used what are now called intertitles to convey dialogue and expository information. These were full-screens of text inserted (not overlaid) into the film at appropriate places. Some attempts were made at overlaying the subtitles which used a second projector and glass slides of text which were manually switched out by the projectionist, hopefully in synchronization with the dialogue. One forward-thinking but overlooked inventor experimented with comic book dialogue balloons which appeared next to the actor who was speaking. These techniques also made distribution of a film to other countries a relatively painless affair — only the intertitles had to be translated.

This changed with the arrival of “talkies” in the late 1920s. Now there was no need for intertitles since you could hear the dialogue. But translations for foreign audiences were still desired, and various time-consuming optical and chemical processes were used to generate the kind of subtitles we think of today. But there were no subtitles for local audiences — no doubt to the irritation of deaf and hard-of-hearing patrons who had been equally enjoying the movies alongside hearing persons for years.


Malcolm J. Norwood

As television grew in popularity, there were some attempts at optical subtitles in the early years, but these were not wildly successful nor widely adopted. In the United States, there was interest brewing in closed captioning systems by the end of the 1960s. In April 1970, the FCC received a petition asking that emergency alerts be accompanied by text for deaf viewers. This request came at a perfect point in time when the technology was ready, and the various parties were interested and prepared to take on the challenge.

It was at this time that Malcolm Norwood, a deaf bureaucrat from the Department of Education (then HEW), entered the story. He had been working in the Captioned Films for the Deaf department since 1960. Today he is often called the Father of Closed Captioning within the community. He was the perfect leader to champion this new technology, and he accepted the challenge.

The FCC agreed in principle with the issues raised, and in response issued Public Notice 70-1328 in December 1970. Malcolm and the DOE brought together a team in 1971 which included the National Bureau of Standards, the National Association of Broadcasters, ABC, and PBS. They held a conference in Nashville in December of 1971, which we can say was the birthplace of closed captioning.

First Captioned TV Program in 1972, The French Chef hosted by Julia Child. These were Open Captions and could not be turned off by the viewer.

It turns out that the technical implementation of broadcasting captions built on existing work. Over at the National Bureau of Standards, engineer Dave Howe had been developing a system called TvTime to distribute accurate time signals over the air. This system sent a code over a video line in the VBI, using a method which eventually morphed into the CC standard. They had been testing the system with ABC, PBS, and NBC. ABC had even begun using this system to send text messages between affiliate stations.

Another system presented at the conference by HRB-Singer altered the vertical scanning of the receiver so that additional VBI lines were visible, and transmitted the caption text digitally, but visually, in those newly-exposed lines. This caused some concern among the TV set manufacturers, and thankfully the NBS system eventually won out.

After a few promising demonstrations, in 1973 PBS station WETA in the District of Columbia was authorized to broadcast closed captioned signals in order to further develop and refine the system. These efforts were successful, and in 1976 the FCC formally reserved line 21 for closed captions.

Adapting Broadcasts for Line 21

This is an index card that kicked around in my briefcase for many years. Can you spot the error?

In its final form, the signal on line 21 had a few cycles of clock run-in to lock the decoder’s data recovery oscillator, followed by a 3-bit start pattern, and finally two parity-protected characters of text. The text encoding is almost always ASCII, with a few exceptions and special symbols considered necessary for the task. Text was always transmitted as a pair of characters, and was traditionally sent in all capital letters. Control codes are also byte pairs, and they perform functions like positioning the cursor, switching captioning services, changing colors, etc. Because control codes were so crucial to the proper display of text, parity protection wasn’t enough — they were usually transmitted twice, the duplicate control code pair being ignored if the first pair was error-free.

Summary of Line 21 data
Basic Rate 503.496 kBd (32 x Horiz freq)
Grouping 2 each 7-bit + parity bit characters / video line
Encoding ASCII, with some modifications
Services odd fields: CC1/CC2, T1/T2
even fields: CC3/CC4, T3/T4, XDS
Specification EIA-608, 47 CFR 15.119, TeleCaption II
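The numbers in the table can be made concrete with a few lines of Python. This is a minimal sketch of the odd-parity rule and text/control discrimination from EIA-608, not a full decoder, and the control-code test is simplified to the 0x10-0x1F range in the first byte:

```python
H_FREQ = 15_734.264                      # NTSC horizontal rate, Hz
assert round(32 * H_FREQ) == 503_496     # the 503.496 kBd figure above

def odd_parity_ok(byte):
    """Each line-21 byte is 7 data bits plus an odd-parity bit:
    the total number of set bits in the byte must be odd."""
    return bin(byte & 0xFF).count("1") % 2 == 1

def decode_cc_pair(b1, b2):
    """Decode one byte pair into text, or None if either byte fails
    parity or the pair is a control code (simplified check)."""
    if not (odd_parity_ok(b1) and odd_parity_ok(b2)):
        return None                      # parity error: discard the pair
    c1, c2 = b1 & 0x7F, b2 & 0x7F
    if 0x10 <= c1 <= 0x1F:
        return None                      # control-code pair, not text
    return chr(c1) + chr(c2)
```

A real decoder would also remember the most recent control pair, so that the duplicated transmission described above can be recognized and ignored.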

Today we are accustomed to near-perfect video and audio programming, thanks to digital transmissions and wired/optical networks. Adding a few extra bytes into an existing protocol packet would barely give us pause. But back then, other factors had to be considered. The resulting CC standard was fine-tuned during lengthy and laborious field tests. The captioned video signal had to be robust when transmitted over the air. Engineers had to address and solve problems like signal strength degradation in fringe reception areas and multipath in dense urban areas.

Pop-On CC Demo Frame from Felix the Cat

As for the captioning methods, there were a few different types available. By far the most common styles were POP-ON and ROLL-UP. In POP-ON captioning, the receiver accumulated the incoming text in a buffer until receipt of a “flip-memory” control code, whereupon the entire caption would appear on-screen at once, simultaneously with the spoken dialogue. This style was typically used with prerecorded, scripted material such as movies and dramas. With ROLL-UP captioning, as its name implies, the text physically rolled up from the bottom of the screen line-by-line. It was used for live broadcasts such as news programs and sporting events. The text naturally lags the audio due to the nature of the live speech transcription process.
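The two styles boil down to different buffer handling in the receiver. A toy model (not production decoder logic) captures the distinction:

```python
class CaptionDecoder:
    """Toy model of the two display styles: POP-ON builds a caption
    off-screen and swaps it in all at once, while ROLL-UP appends
    lines to the screen and scrolls the oldest off the top."""

    def __init__(self, rollup_rows=3):
        self.offscreen = []          # POP-ON accumulates here, invisible
        self.onscreen = []
        self.rollup_rows = rollup_rows

    def popon_text(self, line):
        self.offscreen.append(line)

    def flip_memory(self):
        # "flip-memory" control code: the whole caption appears at once
        self.onscreen, self.offscreen = self.offscreen, []

    def rollup_line(self, line):
        self.onscreen.append(line)
        # keep only the bottom N rows of the roll-up window
        self.onscreen = self.onscreen[-self.rollup_rows:]
```

Nothing here is drawn from a real chip's firmware; it only illustrates why POP-ON can be frame-accurate to scripted dialogue while ROLL-UP inherently trails live speech.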

The Brits Did it Differently, and Implemented Teletext in the Process

Across the pond, broadcast engineers at the BBC approached the issue from a different angle. Their managers asked if there was any way to use the transmitters to send data, since they were otherwise idle for one quarter of each day. Therefore they worked on maximizing the amount of data which could be transmitted.

The initial service worked like a FAX machine by scanning, transmitting, and printing a newspaper page. Eventually, the BBC adopted an all-digital approach called CEEFAX, developed by engineer John Adams of Philips. Simultaneously, a competing and incompatible service called ORACLE was begun by other broadcasters. In 1974, everyone finally settled on a merged standard called World System Teletext (WST), adopted as CCIR 653. Broadcasters in North America adopted a slight variant of WST called the North American Broadcast Teletext Specification (NABTS). Because it runs at a much higher data rate than CC, teletext is less forgiving of transmission errors. It employs a couple of different Hamming codes to protect and optionally recover from errors in key data fields. It is quite a complex format to decode compared to line 21.
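To make the error protection concrete, here is a generic extended Hamming(8,4) code in the same spirit as the one teletext uses for its key data fields: 4 data bits carried in an 8-bit byte, with single-error correction and double-error detection. The bit layout below is the textbook one, not the exact ordering in the teletext specifications:

```python
def hamming84_encode(nibble):
    """Extended Hamming(8,4): 4 data bits, 3 parity bits, plus one
    overall parity bit. Minimum distance 4: corrects any single-bit
    error and detects double-bit errors."""
    d = [(nibble >> i) & 1 for i in range(4)]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    bits = [p1, p2, d[0], p3, d[1], d[2], d[3]]
    overall = 0
    for b in bits:
        overall ^= b
    bits.append(overall)
    word = 0
    for i, b in enumerate(bits):
        word |= b << i
    return word

def hamming84_decode(word):
    """Return (nibble, ok). Correct by nearest valid codeword;
    anything two or more bits from every codeword is uncorrectable."""
    for n in range(16):
        if bin(hamming84_encode(n) ^ word).count("1") <= 1:
            return n, True
    return None, False
```

The brute-force nearest-codeword search works here because there are only 16 valid codewords; real decoders use a syndrome lookup instead, but the correction behavior is the same.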

British Radio Equipment Manufacturer’s Association, Broadcast Teletext Specification

As for the format, teletext services broadcast three-digit pages of text and block graphical data — conceptually an electronic magazine. Categories of content were grouped by pages:

Example of Teletext Magazine Page
  • 100s – News
  • 200s – Business News
  • 300s – Sport
  • 400s – Weather and Travel
  • 500s – Entertainment
  • 600s – TV and Radio Listings

The text in these magazine pages is an integral part of the packet structure. For example, the text of line 4 in page 203 belongs in a specific packet for that page/line. Since the broadcaster is continuously transmitting all magazines and their pages, it may take a few seconds for the page you request to appear on-screen. NABTS takes a more free-form approach. The data can almost be considered a serial stream of text, like a terminal connected to a computer. If you need a new line of text, you send a CR/LF pair.
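The page/row packet model amounts to filing each arriving row under its page number. This toy buffer (not a real teletext decoder, which also tracks magazines, headers, and erasure flags) shows why a page only appears once its row packets have cycled past:

```python
class TeletextPageBuffer:
    """Toy model of the broadcast cycle: each row of each page
    arrives as its own packet, and the receiver files it under the
    page number it belongs to until the page can be displayed."""

    def __init__(self, cols=40):
        self.cols = cols             # 40 characters per row
        self.pages = {}              # page number -> {row: text}

    def packet(self, page, row, text):
        self.pages.setdefault(page, {})[row] = text[:self.cols]

    def render(self, page):
        """Rows received so far for the requested page, in order."""
        rows = self.pages.get(page, {})
        return [rows[r] for r in sorted(rows)]
```

The example page number and row text below are invented for illustration.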

Summary of Teletext Data
Basic Rate 6.938 MBd
Grouping 360 bits/line, 40 available text characters
Encoding Similar to Extended ASCII, with code pages
Services Multiple page magazines, 40×24 chars each page
Specifications Europe: WST ITU-R BT.653 (formerly CCIR 653)
North America: NABTS EIA-516

The Hacks That Made It All Work

Most of my designs were for use in North America, but I needed to learn about European teletext for a few candidate projects. In Europe, page 888 of the teletext system was designated to carry closed captioning text. This page has a transparent background and the receiver overlays it onto the video. The visual result was practically the same as in North America. But it posed some problems regarding media like VHS tapes.

The teletext signal couldn’t be recorded or played back on your typical home VHS recorder. To solve this, many tapes were made using an adaptation of the North American line 21 system, but applied to the PAL video format. This method was variously called line 22 or line 25 (the confusion being that PAL line #1 is in a different place than NTSC line #1), but was basically the same. A manufacturer with a CC decoder in their NTSC product could easily adapt it to work in PAL countries.

How did I get PAL VHS tapes? I asked an engineer colleague at Philips Southampton if he could send me some sample tapes for testing. His wife bought some used from a local rental store and sent them to me. This was before the days of PayPal, so I sent her an international money order for $60. This covered the price of the tapes and shipping, plus a few extra dollars “tip” for her trouble. Some weeks later, I got an email from him saying that “you Americans sure give generous tips”. His wife had received my money order for $600, not $60! It took many months, but eventually the post office caught their mistake and she returned the overage.

The Author Troubleshooting a DVD Closed Caption problem at LG in 2003

In South Korea, a colleague was involved in the captioning industry back in the late 1990s. He was asked to participate on a government panel considering the nationwide adoption of closed captioning. The final result was comical — instead of CC, the committee decided to provide extremely loud external TV speakers free-of-charge to people with hearing difficulties. Fortunately, the conventional form of closed captioning has since been adopted with the advent of digital television broadcasting.

Designing on the Trailing Edge

By the year 2000, almost all televisions had CC decoders built-in. As a result, there were a variety of ICs available to extract and process the line 21 signal. One example was from Philips Semiconductors (which became NXP, and has since merged with Freescale). As a key developer of teletext technology and a major chip supplier to the television industry, they offered a wide variety of CC and teletext processors. I developed several designs based on a chip from their Painter family of TV controllers. These were 8051-based microcontrollers with all the extras needed for teletext, closed captions, and user menus. They had VBI data slicers, character generators and ROM fonts, all integrated onto one die.

Philips Saa55xx datasheet (page 91)

I still remember discovering the Painter chip buried pages deep in an internet search one day. When I couldn’t find any detailed information, I called the local rep and was told, “You aren’t supposed to even know about this part number — it’s a secret!”. Eventually the business logistics were resolved and I was allowed to use the chip. That was the only masked-ROM chip I ever made. I can still feel the rumbling in my stomach on the day I delivered the hex file to the local Philips office. The rep and I were hunched over the computer as we double- and triple-checked each entry on their internal ordering system. Once we pressed SEND, the bits were irrevocably transmitted to the factory and permanently burned into many thousands of chips. Even though we had thoroughly tested and proven the firmware in the lab, it was nevertheless a stressful day.

Philips Painter Chip

As I developed several other designs, it became clear that these special-purpose chips should be avoided if any reasonable longevity was needed. The Painter chips were being phased out, and several other options were disappearing as well. The writing was on the wall — digital broadcasting was here to stay, and the chip manufacturers were no longer making or supporting analog CC chips. I decided that future CC projects had to be done using general purpose ICs. I plan to delve into that in a future article along with unexpected applications of CC technology, the process of making captions, and how captioning made (or didn’t make) the transition to digital broadcasting and media.

Read the whole story
5 days ago
Burlington, Ontario
Share this story

Detergent DRM Defeated on Diminutive Dishwasher


Has it really come to this? Are we really at the point that dishwashers have proprietary detergent cartridges that you’re locked into buying at inflated prices?

Apparently so, at least for some species of the common kitchen appliance. The particular unit in question goes by the friendly name of Bob, and is a compact, countertop unit that’s aimed at the very small kitchen market. [dekuNukem] picked one of these units up recently, and was appalled to learn that new detergent cartridges would cost an arm and a leg. So naturally, he hacked the detergent cartridges. A small PCB with an edge connector and a 256-byte EEPROM sprouts from each Bob cartridge; a little reverse engineering revealed the right bits to twiddle to reset the cartridge to its full 30-wash count, leading to a dongle to attach to the cartridge when it’s time for a reset and a refill.
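In the same spirit (the real offsets and encoding are [dekuNukem]'s findings, not reproduced here; everything below is hypothetical), the reset amounts to rewriting one field in a dump of the cartridge's 256-byte EEPROM:

```python
def reset_wash_count(eeprom_dump, counter_offset=0x10, full_count=30):
    """Return a modified copy of a cartridge EEPROM image with the
    wash counter restored to its full value. The 0x10 offset and the
    single-byte encoding are placeholders, not the real layout."""
    if len(eeprom_dump) != 256:
        raise ValueError("expected a full 256-byte dump")
    img = bytearray(eeprom_dump)
    img[counter_offset] = full_count
    return bytes(img)
```

In practice a dongle like [dekuNukem]'s would read the EEPROM over its edge connector, patch the counter, and write it back; the function above only models the patch step on a captured image.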

With the electronics figured out, [dekuNukem] worked on the detergent refill. This seems like it was the more difficult part, aided though it was by some fairly detailed specs on the cartridge contents. A little math revealed the right concentrations to shoot for, and the ingredients in the OEM cartridges were easily — and cheaply — sourced from commercial dishwashing detergents. The cartridges can be refilled with a properly diluted solution using a syringe; the result is that each wash costs 1/75th of what it would be if he stuck with OEM cartridges.
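The concentration math is ordinary dilution arithmetic. With made-up numbers (the actual targets come from the cartridge spec sheets, which aren't reproduced here):

```python
def refill_mix(cartridge_ml, target_pct, stock_pct):
    """Volumes of stock detergent and water needed to fill a
    cartridge to a target concentration (simple v/v dilution)."""
    if target_pct > stock_pct:
        raise ValueError("stock is weaker than the target")
    stock_ml = cartridge_ml * target_pct / stock_pct
    return stock_ml, cartridge_ml - stock_ml
```

For example, hitting a hypothetical 10% target in a 100 ml cartridge from a 50% commercial concentrate means 20 ml of concentrate topped up with 80 ml of water.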

For as much as we despise the “give away the printer, charge for the ink” model, Bob’s scheme somehow seems even worse. We’ve seen this technique used to lock people into everything from refrigerator water filters to cat litter, so we really like the way [dekuNukem] figured everything out here, and that he saw fit to share his solution.


The False Alarm That Nearly Sparked Nuclear War


The date was September 26, 1983. A lieutenant colonel in the Soviet Air Defence Forces sat at his command station in Serpukhov-15 as sirens blared, indicating nuclear missiles had been launched from the United States. As you may have surmised by the fact you’re reading this in 2021, no missiles were fired by either side in the Cold War that day. Credit for this goes to Stanislav Petrov, who made the judgement call that the reports were a false alarm, preventing an all-out nuclear war between the two world powers. Today, we’ll look at what caused the false alarm, and why Petrov was able to correctly surmise that what he was seeing was an illusion.

Detecting Missiles By Infrared

Stanislav Petrov pictured at his home in 2016.

Petrov was in charge of monitoring the Oko early warning satellite network, which consisted of a series of satellites in highly elliptical Molniya orbits. This orbit was cleverly chosen by Soviet scientists to allow the Oko satellites to have a grazing view of the continental United States, which presented the biggest threat of intercontinental ballistic missile (ICBM) attack at the time. By glancing across the Earth’s edge with their infrared sensors, rather than gazing down upon it, the infrared energy from hot missile exhaust could easily be spotted against the cold background of space, rather than the Earth’s surface. The aim was to cut down on false positives from phenomena such as wildfires and oil rig burnoffs, while also providing good coverage without requiring a large number of satellites.

On that fateful night in September, however, the unique orbit of the Soviet satellite was to cause a major problem. The Oko system raised an alarm shortly after midnight, indicating that a single missile had been launched from the United States. As the sirens were going off around him, Petrov almost froze. The political climate at the time was fraught, with all-out nuclear war a constant threat.

The siren howled, but I just sat there for a few seconds, staring at the big, back-lit, red screen with the word ‘launch’ on it.

The initial alarm was followed by further alerts, showing five missiles in total. Despite the indications that all hell was about to break loose, Petrov didn’t immediately pass the alert up the chain of command.

Is This Thing Broken?

With only minutes to react to a strike, time was of the essence, but things didn’t add up. Starting a nuclear war with just five missiles didn’t strike Petrov as a believable strategy, and ground radar operators were unable to report detecting any launches. The Oko satellites were also new and relatively untested thus far. Thus, rather than report that a nuclear strike on the USSR was underway, Petrov elected to go with his hunch and report that the system was malfunctioning.

Twenty-three minutes later I realised that nothing had happened. If there had been a real strike, then I would already know about it. It was such a relief.

A diagram indicating the rough relative positions of the sun, the Oko satellite, and the US missile field it was tasked with monitoring.

Petrov’s gut feeling turned out to be on the money, and the dogs of war were kept on a leash that evening. It was indeed a false alarm, rather than American missiles, that had caused the warning. The date of the incident happened to be right around the autumn equinox. Due to the position of the sun and some high-altitude clouds, sunlight was reflected onto the satellite’s infrared sensors, triggering the satellite to report multiple missile launches. The incident led to the Soviet Union creating a complementary geostationary satellite system in order to corroborate any indications of missile launches from the USA.

Stanislav Petrov was neither rewarded nor particularly admonished for his decision. Eventually, he was demoted on a technicality for not filling out his diary while tangling with the agonising decision as to whether the Earth should burn in nuclear fire on that cold September night. He lived out the rest of his life in Russia, passing away at the age of 77 in 2017.

Disaster was thus averted by Petrov’s actions; in the hair-trigger military environment of the time, USSR officials warned of incoming US missiles would likely have given the order to launch, causing untold devastation. Instead, we’re left with a great story and an even more poignant lesson. Redundancy is always key in systems that deal with matters of life or death, all the more so when said lives (or deaths) can be measured in the millions or billions. That, and that sometimes you’ve got to err on the side of caution, particularly when nuclear war is involved.


A HALO of LEDs for Every Ear


Few things get a Hackaday staffer excited like bunches of tiny LEDs. The smaller and denser the better, any form will do as long as we can get a macro shot or a video of a buttery smooth animation. This time we turn to [Sawaiz Syed] and [Open Kolibri] to deliver the brightly lit goods with the minuscule HALO 90 reactive LED earrings.

The HALO 90’s are designed to work as earrings, though we suspect they’d make equally great brooches, hair accessories, or desk objects. To fit this purpose each one is a minuscule 24 mm in diameter and weighs a featherweight 5.2 grams with the CR2032 battery (2.1 g for the PCBA alone). Functionally their current software includes three animation modes, each selectable via a button on device; audio reactive, halo (fully lit), and sparkle. Check out the documentation for details on expected battery life in each mode, but suffice to say that no matter what these earrings will make it through a few nights out.

In terms of hardware, the HALO 90’s are as straightforward as you’d expect. Each device is driven by an STM8 at its maximum 16 MHz, which is more than fast enough to keep the 90 charlieplexed 0402 LEDs humming along at a 1 kHz update rate, even with realtime audio processing. In fact the BOM here is refreshingly simple with just 8 components: the LEDs, microcontroller and microphone, battery holder and passives, and the button. [Sawaiz] even designed an exceptionally slick case to go with each pair of earrings, which holds two HALO 90’s with two CR2032’s and includes a magnetic closure for the most satisfying lid action possible.
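Charlieplexing is what makes 90 LEDs practical on so small a board: each ordered pair of GPIO pins (one driven high, one low, the rest tri-stated) addresses one LED, so n pins drive n*(n-1) LEDs. Ten pins, a plausible allocation for an STM8 though not confirmed from the design files, is exactly enough:

```python
from itertools import permutations

def charlieplex_leds(n_pins):
    """All ordered (source, sink) pin pairs; each pair lights a
    unique LED when the source is high, the sink is low, and every
    other pin is left tri-stated (high impedance)."""
    return list(permutations(range(n_pins), 2))

# n * (n - 1): ten pins address the 90 LEDs on a HALO 90
assert len(charlieplex_leds(10)) == 90
```

The trade-off is that only a subset of LEDs can be lit at any instant, which is why the firmware has to rescan the whole matrix continuously, hence the 1 kHz update rate mentioned above.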

As with some of his other work, [Sawaiz] has produced a wealth of exceptional documentation to go with the HALO 90’s. They’re available straight from him fully assembled, but with documentation this good the path to a home build should be well lit and accessible. He’s even chosen parts with an eye towards long availability, low cost, and ease of sourcing so no matter when you decide to get started it should be a snap.

It was difficult to choose just a few images from [Sawaiz]’s mesmerizing collection, so if you need more feast your eyes on the expanded set after the break.

HALO 90’s in their cases, ready for a night out
Even the panel is aesthetically pleasing

DIY Wireless Serial Adapter Speaks (True) RS-232


There is a gotcha lurking in wait for hackers who look at a piece of equipment, see a port labeled “Serial / RS-232”, and start to get ideas. The issue is the fact that the older the equipment, the more likely it is to be a bit old-fashioned about how it expects to speak RS-232. Vintage electronics may expect the serial data to be at bipolar voltage levels that are higher than what the typical microcontroller is used to slinging, and that was the situation [g3gg0] faced with some vintage benchtop equipment. Rather than deal with cables and wired adapters, [g3gg0] decided to design a wireless adapter with WiFi and Bluetooth on one end, and true RS-232 on the other.

The adapter features an ESP32 and is attached to a DB-9 plug, so it’s nice and small. It uses the ST3232 chip to communicate at 3 V logic levels on the microcontroller side, supports bipolar logic up to +/-13 V on the vintage hardware side, and a rudimentary web interface allows setting hardware parameters like baud rate. The nice thing about the ST3232 transceiver is that it is not only small, but can work from a 3 V supply with only four 0.1 uF capacitors needed for the internal charge pumps.
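The "bipolar" part is what trips people up: RS-232 swings both sides of ground and is inverted relative to logic levels. A small helper makes the mapping explicit (the +/-5.5 V default is roughly what an ST3232-class charge pump produces from a 3 V rail; the standard itself only requires something beyond +/-3 V at the receiver):

```python
def rs232_level(logic_bit, swing=5.5):
    """RS-232 line voltage for a logic bit. Polarity is inverted
    relative to logic levels: logic 1 ('mark', also the idle state)
    is a NEGATIVE voltage, logic 0 ('space') is positive."""
    return -swing if logic_bit else +swing
```

This inversion is why wiring a 3 V UART pin directly to a true RS-232 port not only fails to communicate but can stress the microcontroller pin; the transceiver chip handles both the voltage translation and the polarity flip.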

As for actually using the adapter, [g3gg0] says that the adapter’s serial port is exposed over TCP on port 23 (Telnet) which is supported by some programs and hardware. Alternately, one can connect an ESP32 to one’s computer over USB, and run firmware that bridges any serial data directly to the adapter on the other end.

Design files including schematic, bill of materials, and PCB design are shared online, and you can see a brief tour of the adapter in the video, embedded below.

RS-232 on serial interfaces was around for a long time, so we’ve definitely not seen the end of projects like this. [g3gg0]’s board would sure have made it easier to IoT-ify this old LED signboard.


VCF Swap Meet Takes Step Back To Move Forward


When computers were the sort of thing you ordered from a catalog and soldered together in your garage, swap meets were an invaluable way of exchanging not just hardware and software, but information. As computers became more mainstream and readily available, the social aspect of these events started to take center stage. Once online retail started really picking up steam, it was clear the age of the so-called “computer show” was coming to a close. Why wait months to sell your old hardware at the next swap when you could put it on eBay from the comfort of your own home?

Of course, like-minded computer users never stopped getting together to exchange ideas. They just called these meets something different. By the 2000s, the vestigial remnants of old school computer swap meets could be found in the vendor rooms of hacker cons. The Vintage Computer Festival (VCF) maintained a small consignment area where attendees could unload their surplus gear, but it wasn’t the real draw of the event. Attendees came for the workshops, the talks, and the chance to hang out with people who were passionate about the same things they were.

Consignment goods at VCF East XIII in 2018.

Then came COVID-19. For more than a year we’ve been forced to cancel major events, suspend local meetups, and in general, avoid one another. Some of the conventions were revamped and presented virtually, and a few of them actually ended up providing a unique and enjoyable experience, but it still wasn’t the same. If you could really capture the heart and soul of these events with a video stream and a chat room, we would’ve done it already.

But this past weekend, the folks behind VCF East tried something a little different. As indoor gatherings are still strongly discouraged by New Jersey’s stringent COVID restrictions, they decided to hold a computer swap meet in the large parking lot adjacent to the InfoAge Science and History Museum. There were no formal talks or presentations, but you could at least get within speaking distance of like-minded folks again in an environment where everyone felt comfortable.

A Promising Start

If you’re going to walk around a parking lot with your arms full of gear, the end of April at the Jersey shore isn’t a bad time or place to do it. The beautiful weather certainly helped the turnout, which by all accounts, was even better than expected. In accordance with the state’s current COVID guidelines, tables were kept a minimum of six feet from each other and everyone was required to wear a mask when entering the cordoned off area. These were simple and reasonable precautions given the current situation, and nobody had a problem complying with them.

It should be said that VCF held a similar swap last year, but given that it was during the earlier stages of the pandemic, it was a more low-key affair. Even so, enough people showed up during those uncertain times that the organizers were emboldened to do it again with a stronger advertising push. With the safety precautions in place, the improving weather, and the amount of time we’ve all been stuck indoors, far more people were willing to poke their head out this time around.

That said, I couldn’t help but get the feeling the organizers were hesitant to fully commit to the event given the circumstances. The only onsite amenity offered was a single portable toilet that became increasingly crowded as the day went on, and even the table selling official VCF merchandise wasn’t fully set up until later in the morning.

Keeping your overhead as low as possible for an experimental event like this is understandable, but getting in contact with some local food trucks would have at least made sure there were refreshments available for people who had been standing outside for several hours. A number of attendees also commented that a portable ATM would have been welcome, as they ran out of cash when it turned out there were more sellers than they had anticipated.

Old Meets New

While not exactly a complaint, several people I spoke to said that they were unsure what to expect when they showed up given the ambiguous messaging of the event. Naturally it had been advertised as the “VCF Swap Meet”, but it wasn’t immediately clear if vintage computers would be the only hardware on the menu. Adding to the confusion is the fact that the term “vintage computer” tends to have a different meaning depending on how many times you’ve traveled around the sun. Are we talking about Commodore and Atari, or blinkenlights and toggle switches?

In the end, the sellers that showed up offered a healthy mix of modern and classic computers with a sprinkling of electronic components and amateur radio. Older computers did outnumber the contemporary machines by a bit, at least partially due to VCF themselves operating several tables to offload their own surplus inventory, but there were enough bins full of modern video cards and SATA hard drives to even things out.

Undoubtedly, some people would have preferred the event to cater exclusively to vintage hardware. But a more balanced approach is far more attainable, and frankly, more likely to succeed given that it will bring in a larger array of buyers and sellers. Going forward, VCF should consider succinctly advertising the event as a generic computer and electronic swap meet with a classic computing theme.

Coming Full Circle

While several notable events have tentatively scheduled dates for 2021, it’s far too early to say for sure if it will be safe to resume large scale indoor gatherings this year. We all desperately want things to return to normal, but the reality is, the threat of COVID-19 is still hanging over our heads.

With that being the case, I hope that other groups are inspired by the success the Vintage Computer Federation has found with their swap meet. It’s proof that, at least while the weather holds, not everything has to be done virtually. If one European-style hacker camp can continue in the face of the pandemic, then perhaps America’s version could be a revitalization of the classic traveling computer show. At the very least, here’s hoping that the VCF decides to continue holding these swap meets even after the coast is clear.
