
Conquering The Earth With Cron


The GOES-R series of Earth observation satellites is the latest and greatest NASA has to offer. As you might expect, part of the GOES-R job description is imaging Earth at high resolution, but the satellites also feature real-time lightning monitoring as well as enhanced solar flare and space weather detection. Four of these brand new birds will be helping us keep an eye on our planet’s condition into the 2030s. Not a bad way to spend around 11 billion bucks.

To encourage innovation, NASA is making the images collected by the GOES-R satellites available to the public through a collaboration with Google Cloud Platform. [Ben Nitkin] decided to play around with this data and came up with an interactive website that lets you visualize the Earth from the perspective of GOES-R. But don’t let those slick visuals fool you: the site is powered by a couple of cron jobs and some static HTML. Just as Sir Tim Berners-Lee intended it.

But it’s not quite as easy as scheduling a wget command; the images GOES-R collects are separated into different wavelengths and need to be combined to create a false-color image. A cron job fires off every five minutes to download and merge the raw GOES-R images, and another cron job starts a Python script that uses ffmpeg to turn the images into WebM time-lapse videos. All of the Python scripts and the crontab file are available on GitHub.
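For flavor, the crontab for this kind of pipeline might look something like the following sketch (the script names and the hourly time-lapse schedule here are hypothetical, not necessarily [Ben]’s actual setup):

*/5 * * * * /opt/goes/fetch_and_merge.py   # every five minutes: download the latest bands and composite them
0 * * * * /opt/goes/make_timelapse.py      # hypothetical hourly run: stitch recent frames into WebM via ffmpeg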

Finally, with the images merged and the videos created, the static HTML website is served out to the world courtesy of a quick and dirty Python web server. The site could be served via something more conventional, but [Ben] likes to keep overhead as low as possible.
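If you’ve never tried it, Python’s built-in web server makes that a one-liner; running something like this from the directory holding the generated HTML is likely all that’s needed:

python3 -m http.server 8080   # serve the current directory over HTTP on port 8080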

If you want to take the more direct route, we’ve covered plenty of projects focused on pulling down images from weather satellites, from using old-school “rabbit ears” to decoding the latest Russian Meteor-M N2 downlink.






LEGO: The Kristiansen Legocy


Whether you are young, old, or a time-traveling Vulcan, something unites all of us globally: the innocent LEGO blocks that encourage creativity over spoon-fed entertainment. Have you noticed the excess of zombified children and adults alike drooling over their collective screens lately? Back in ancient times, all a child needed to create hours of joy were plastic interlocking bricks and a place for their parents to trip over them. The LEGO Group nurtured the inspiration behind our childhood inventiveness, and none of it would have been possible without its founder, Ole Kirk Kristiansen (or Christiansen). The humble carpenter from Denmark forever made his mark on the little Scandinavian country, one brick at a time.

Well, maybe not at first. You see, the plastic LEGO bricks we all know and love were initially made of wood. And they were not actually bricks, either.

History Lesson

The year was 1932, and the Great Depression was taking its toll on the majority of the world’s inhabitants. After losing his job, Ole Kirk did what any sane man would in his situation — he began his wooden toy business. Initially, his venture had a hard time making money, as one might expect under the unfortunate circumstances. Keeping cost-efficient designs in mind, he focused his efforts on simple wooden blocks, ducks, and tractors.

Then things took a turn for the better, and Denmark’s economy slowly but surely bounced back. Children loved Kristiansen’s wooden toys so much that his business started to prosper, and his son Godtfred joined the company. Several wood factory fires later, they acquired a plastic injection molding machine in 1947 to open up new material options. It was a few years after this that the inevitable system began to take shape. The LEGO system entails perfectly sized and shaped blocks that all fit together to create an ideal world. The system is still very much alive in LEGO design, and it will continue to be until the end of time. However, the end of Ole Kirk’s time came too soon. What he left behind for his son was the beginnings of a toy empire.

It’s Patent Time

Interlocking LEGO Patent

Initially, the plastic LEGO blocks were not interlocking. Can you imagine LEGO blocks today not snapping together? There would never be the issue of the tiny flat pieces snapping together for eternity. Alas, the company figured out rather quickly that it needed a sturdier design so that block creations would not be destroyed by siblings so easily. In 1958, the same year the father of LEGO passed away, the company submitted the patent for an interlocking plastic LEGO block. Three years later it was approved, and they had lift-off. Fast forward three more years, and manuals started showing up in LEGO kits. The kits were becoming progressively more complicated, to the point that children would have a harder time figuring out how to build the bloody thing without the manual. This increase in difficulty perhaps increased the enjoyment once the kit was complete; as we all know, the harder something is to do, the more satisfying it is when you do it. The addition of the interlocking feature truly was the turning point for the toy.

Minifigure LEGO Patent

In the early ’70s, Godtfred’s son Kjeld joined the family business. He created a research and development center that was essential to updating the LEGO kits as time went on. The designs of the kits kept evolving, and research showed that additional parts were needed to bring the sets to life even more. The next revolutionary addition to the LEGO toys was the minifigure, shown to the left. These adorable smiling figures truly brought the LEGO scenes to life, which is the whole point of the company.
Syntax and Semantics

As I am sure most LEGO buffs already know, the word “LEGO” comes from the Danish words leg and godt, literally meaning “play well”. This phrase encapsulates the company almost as much as their official motto, det bedste er ikke for godt, or “only the best is good enough”. But how does it work, anyway? Is it LEGO? Legos? Eggo? The grammatical question of how to properly write the word LEGO — both singular and plural — has frustrated a surprising number of people. The LEGO authorities want the word used only as an adjective, always singular and capitalized. Lots of Americans, on the other hand, seem to enjoy practicing their freedom of English speech and use a plural, uncapitalized form of the word. The fight over how to write the word seems never-ending, for some reason. Maybe it’s just because people like to argue for the sake of arguing (we’re looking at you, commenters).

The Key to Surviving as a Billion Dollar Toy Empire

LEGO has a way of catering to its customers. If you don’t have the 4,016-piece LEGO Death Star proudly assembled on your coffee table right now, can you really call yourself a true Star Wars fan? One of the many reasons the LEGO Group was — and still is — so successful is that it knew how to evolve when times changed. Its core beginnings were very simple, but as the toy business grew more complex, so did its marketing strategies. How many of you have a LEGO Millennium Falcon sitting at home? A LEGO Indiana Jones Temple Escape set? What about the LEGO Ghostbusters Firehouse Headquarters? I am willing to bet that at least one of our readers has each of these. Let me know in the comments if I’m wrong. Alternatively, are there any antique LEGO Space Shuttles out there? A classic castle set, perhaps? These are two of the most iconic LEGO sets ever created, and the company admits to often going back to these bread-and-butter kits for stability. The LEGO Group does a very good job of balancing licensed blockbusters against those reliable classics, ensuring the company can carry on while still taking risks.

The 1998 LEGO Mindstorms series is the best-selling LEGO line of all time. At least at first, the majority of buyers were not kids but PhD students and professional coders who wanted to mess with the kits. As time progressed, however, Mindstorms became an invaluable learning tool for students and adults alike. Homebrew interfaces and designs were created, physical modifications were made, and bricks were “bricked”. We featured an article earlier about a LEGO exoskeleton controlling a robot built on a LEGO Mindstorms NXT system. Even before that, the FIRST LEGO League was created. It draws students in to hack the Mindstorms and get more comfortable with hardware and software in general. Learning about robotics or computer science for the first time can be daunting, but it’s a lot easier when working with something you’re comfortable with. And who isn’t comfortable with LEGO?

The LEGO Group has effectively made its presence known in all the far reaches of the world, but none of that would have been possible without the ingenuity and sincerity of Kjeld, his father Godtfred, and his father’s father, Ole Kirk Kristiansen. Their goal of giving every child the opportunity for meaningful playtime is being achieved every day by the company they built from the ground up.

Now, sit back and get nostalgic about the good ole days with this compilation of the classic Space LEGO toys:

Adobe to announce Photoshop for iPad


"FINE," Adobe said in a press release. "I hope you're goddamn satisfied."

Adobe’s chief product officer of Creative Cloud, Scott Belsky, confirmed that the company was working on a new cross-platform iteration of Photoshop and other applications, but declined to specify the timing of their launches.

“My aspiration is to get these on the market as soon as possible,” Belsky said in an interview. “There’s a lot required to take a product as sophisticated and powerful as Photoshop and make that work on a modern device like the iPad. We need to bring our products into this cloud-first collaborative era.”

Adobe had refused to do so until now, claiming that iPads could not replace traditional computers for creative work. But Affinity Photo made a fool of that claim.


Linux Fu: Scripting for Binary Files


If you ever need to write a binary file from a traditional language like C, it isn’t all that hard to do. About the worst thing you might have to deal with is an attempt to fake line endings across Windows and Linux, but there’s usually a way to turn that off if it is on by default. However, if you are using some type of scripting language, binary file support might be a bit more difficult. One answer is to use a tool like xxd or t2b (text-to-binary) to handle the details. You can find the code for t2b on GitHub, including prebuilt binaries for many platforms. You should be able to install xxd from your system repository.
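On Debian-flavored systems, for instance, that should be as simple as:

sudo apt install xxd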

These tools take very different approaches. You might be familiar with tools like od or hexdump for producing readable representations of binary files. The xxd tool can do the same thing, although it is not as flexible. What xxd can do that those tools can’t is reverse itself, rebuilding a binary file from a hex dump it created. The t2b tool takes a much different approach: you issue commands to it that cause it to write a new binary file from scratch.
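The round trip with xxd looks like this: dump the binary to hex text, edit the text with whatever tools you like, then reverse the dump back into a binary:

xxd original.bin > dump.txt     # binary to editable hex dump
xxd -r dump.txt > rebuilt.bin   # hex dump back to binary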

Both of these approaches have some merit. If you are editing a binary file in a scripting language, xxd makes perfect sense. You can convert the file to text, process it, and then roll it back to binary using one program. On the other hand, if you are creating a binary file from scratch, the t2b program has some advantages, too.

I decided to write a few test scripts using bash to show how it all works. These aren’t production scripts, so they aren’t as hardened as they could be, but there’s no reason you couldn’t make them as robust as you like.

Cheating a Little

I decided to write two shell scripts. One generates an image file. I cheated in two ways there. First, I picked the PPM (Portable Pix Map) format, which is very simple to create. Second, I ignored the variant of the format that uses ASCII instead of binary. That second shortcut isn’t really cheating, because the ASCII variant makes a larger file, as you’d expect, so there is a genuine benefit to using the binary format.

The other script takes a file in the same format and cuts the color values within it by half. This shows off both tools since the first job is generating an image file from data and the second one is processing an image file and writing out a new one. I’ll use t2b for the first job and xxd for the second.

PPM File Format

The PPM format is part of a family of graphics formats from the 1980s. These files are very simple to construct and deconstruct, although they aren’t known for being small. However, if you need to create a graphic from, say, a Raspberry Pi program, it is sometimes handy to write it in this simple format and then use ImageMagick or some other tool to convert it to a nicer format like PNG.
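With ImageMagick installed, that conversion is a one-liner:

convert image.ppm image.png   # ImageMagick: PPM in, PNG out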

There are actually three variants of the format. One for black and white, one for grayscale, and another for color. In addition, each of them can contain ASCII data or binary data. There is a very simple header which is always in ASCII.

We’ll only worry about the color format here. The header starts with the string “P6”. That is usually followed by a newline, although defensively, you ought to allow for any whitespace character to end each header field. Then the X and Y limits — in decimal and still in ASCII — appear, separated by whitespace; in practice, that’s usually a space between the two numbers and a newline at the end. The next part of the header is another ASCII decimal value indicating the maximum value for the color components in the image. After that, the data is binary RGB (red/green/blue) triplets. By the way, if the P6 had been a P3, everything would remain the same except that the RGB triplets would be in ASCII, not binary. That could be handy in some cases but, as I mentioned, results in a larger file.

Here’s a sample header with a little bit of binary data following it:
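00000000: 5036 0a38 3030 2036 3030 0a32 3535 0aff  P6.800 600.255..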

The hex values are on the left and their ASCII representations are on the right, with unprintable characters shown as dots. You can see the first 15 bytes are header (P6, a newline, the 800 600 dimensions, a newline, 255, and a final newline); after that, it is all image data.

T2B

The t2b program takes a variety of commands to generate output. You can write a string or various sizes of integers. You can also do things like repeat output a given number of times and even choose what to output based on conditions. There’s a way to handle variables and even macros.

As an example, my script will write out an image with three color bars in it. The background will be black with a white border. The color bars will automatically space to fit the box. I won’t use too many of the t2b features, but I did like using the macros to make the resulting output easier to read.  Here’s the code for creating the header (with comments added):

strl P6  # Write P6 followed by a newline (no quotes needed because no whitespace in the string)
str $X   # Write the X coordinate (no newline)
u8 32    # a space
strl $Y  # The Y coordinate (with newline)
strl 255 # Maximum subpixel value (ASCII)

That’s all there is to it. The RGB triples use the u8 command, although you could probably use a 24-bit command, too. I also set up some macros for the colors I used:

macro RED          # expands to one red pixel
  begin
    u8 255         # red component at full brightness
    times 2 u8 0   # green and blue components at zero
    endtimes
endmacro

Once you have the t2b language down, the rest is just math. You can find the complete code on GitHub, but you’ll see it just computes 7 equal-sized regions and draws different colors as it runs through each pixel in a nested set of for loops. There’s also a one-pixel white border around the edges for no good reason.
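To give you the flavor without reproducing the whole thing, here is a rough skeleton of that logic (an illustrative sketch, not the actual colorbar.sh; the choice and placement of bar colors is a guess, and the header commands shown earlier would precede this output):

#!/bin/bash
# Sketch: walk every pixel, emitting t2b u8 commands for each RGB component
W=${1:-800}; H=${2:-600}
RW=$(( W / 7 ))                      # the image divides into 7 equal regions
emit() { echo "u8 $1"; echo "u8 $2"; echo "u8 $3"; }   # one RGB triplet
for (( y=0; y<H; y++ )); do
  for (( x=0; x<W; x++ )); do
    if (( x==0 || y==0 || x==W-1 || y==H-1 )); then
      emit 255 255 255               # one-pixel white border
    else
      case $(( x/RW > 6 ? 6 : x/RW )) in
        1) emit 255 0 0 ;;           # red bar
        3) emit 0 255 0 ;;           # green bar
        5) emit 0 0 255 ;;           # blue bar
        *) emit 0 0 0 ;;             # black background
      esac
    fi
  done
done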

When you want to run the code, you can either specify the X and Y dimensions or take the 800×600 default:

./colorbar.sh 700 700 | t2b >outputfile.ppm

If you intercept the output before the t2b program, you’ll see the commands rolling out of the script. Here’s the default output to the ppm file:

Shades of Gray

The other script is a little different. The goal is to divide all the color values in a PPM file in half. If it were just binary data, that would be easy enough, but you need to skip the header so as not to corrupt it. That takes a little extra work. I used gawk (GNU awk) to make the work a little simpler.

The code expects output from xxd, which looks like this:

00000000: 5036 0a38 3030 2036 3030 0a32 3535 0aff  P6.800 600.255.. 
00000010: ffff ffff ffff ffff ffff ffff ffff ffff  ................ 
00000020: ffff ffff ffff ffff ffff ffff ffff ffff  ................ 
00000030: ffff ffff ffff ffff ffff ffff ffff ffff  ................ 
00000040: ffff ffff ffff ffff ffff ffff ffff ffff  ................ 

The address isn’t important to us. You can ask xxd to suppress it, but it is also easy to just skip it. The character representations to the right aren’t important either; xxd ignores them when it rebuilds the binary. Here’s the awk code (which is embedded in the shell script):

# need to find 4 white space fields
BEGIN  { noheader=4 }
    {
    lp=1
    }
    {
    split($0, chars, "")
# skip initial address
    while (chars[lp++]!=":");
    n=0;  # # of bytes read
# get two characters 
    while (n<16 && lp<length(chars)) {
# heuristically, two space characters out of xxd ends the hex dump line (ascii follows)
      if (chars[lp] ~ /[ \t\n\r]/) {
        if (chars[++lp] ~ /[ \t\n\r]/) {
          break; # no need to look at rest of line
        }
      }
      b=chars[lp++] chars[lp++];
      n++;
# if header then skip white space
      if (noheader>0) {
      if (b=="20" || b=="0a" || b=="0d" || b=="09") noheader--;
    }
    else {
    # if not header, then divide the byte by 2
     bn=strtonum("0x" b)/2;
     bs=sprintf("%02x",bn);
     chars[lp-2]=substr(bs,1,1);
     chars[lp-1]=substr(bs,2,1);
    }
  }
# recombine array and print
  p=""
  for (i=1;i<=length(chars);i++) p=p chars[i];
  print p
  }

The awk code simply skips the address and then pulls up to 16 bytes from a line of data. The first task is to count whitespace characters in order to skip over the header. I made the assumption that the header would not contain runs of whitespace, although a more robust program would consume multiple spaces (easy to fix). After that, each byte gets divided and reassembled. This task is more character-oriented, and awk doesn’t handle individual characters well without a trick.

In particular, I used the split command to convert the current line into an array with each element containing a character. This includes any whitespace characters because I used an empty string as the split delimiter:

split($0, chars, "")

After processing the array — which isn’t hard to do — you can build a new string back like this:

p=""
for (i=1;i<=length(chars);i++) p=p chars[i];

The output file will feed back to xxd with the -r option and you are done:

xxd infile.ppm | ./half.sh | xxd -r >outfile.ppm

Two is the Loneliest

This is a great example of how the Unix philosophy makes it possible to build tools that are greater than the sum of their parts. A simple program turns a text-processing language like awk into a binary file manipulation language. Great. By the way, if your idea of manipulating binary is Intel hex or Motorola S-records, be sure to check out srec_cat and its related tools, which can manipulate those formats, too.

Once you have a bunch of binary files, you might appreciate an online hex editor. By the way, a couple of years ago, I mentioned using od to process binary files in awk. That’s still legitimate, of course, but xxd allows you to go both ways, which is a lot more useful.






How Etak Paved the Way to Personal Navigation


Our recent “Retrotechtacular” feature on an early 1970s dead-reckoning car navigation system stirred a memory of another pre-GPS solution for the question that had vexed the motoring public on road trips into unfamiliar areas for decades: “Where the heck are we?” In an age when the tattered remains of long-outdated paper roadmaps were often the best navigational aid a driver had, the dream of an in-dash scrolling map seemed like something Q would build for James Bond to destroy.

And yet, in the mid-1980s, just such a device was designed and made available to the public. Dubbed Etak, the system was simultaneously far ahead of its time and doomed to failure by the constellation of global positioning satellites being assembled overhead as it was being rolled out. Given the constraints it was operating under, Etak worked very well, and even managed to introduce some of the features of modern GPS that we take for granted, such as searching for services and businesses. Here’s a little bit about how the system came to be and how it worked.

Dead Reckoning Revived

Stan Honey showing off his Navigator Model 450 in 1986. Source: Honeynav.com

Etak’s story begins on a racing yacht off the coast of California in 1983. The yacht’s owner was serial Silicon Valley entrepreneur Nolan Bushnell, who along with Ted Dabney founded Atari and introduced the world to Pong. Along for the race was Stan Honey, an engineer and seasoned yachtsman who had signed on as navigator. Bushnell and Honey began brainstorming how to turn the navigation system Honey had built for the yacht into a land-based system. By the end of the race, Honey had the rough design for a system and a promise from Bushnell to bankroll its development.

From his navigation experience, Honey knew the challenges of dead reckoning, the navigational technique that depends on keeping track of the distance and direction traveled from known positions, or fixes. Small errors in measuring either the distance or the direction of travel accumulate until the position can’t be determined with any reliability. Honey figured that correcting for dead reckoning errors in a car navigation system would be easy because drivers are generally constrained to roads; correlating the current assumed position to the nearest roadway would provide a constant series of fixes and keep the system on track.
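The bookkeeping behind dead reckoning is simple enough to sketch in a few lines of awk (a toy illustration only, with a hypothetical travel.log holding “distance heading-in-degrees” pairs): each measurement adds a vector to the running position, and any error in distance or heading compounds with every line:

awk 'BEGIN { pi = atan2(0, -1) }
     { th = $2 * pi / 180             # heading to radians
       x += $1 * sin(th)              # east-west component
       y += $1 * cos(th)              # north-south component
       printf "x=%.2f y=%.2f\n", x, y }' travel.log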

The Etak Navigator system.

The company Honey formed to commercialize this idea was called Etak, after a term used by aboriginal Polynesian mariners for moving navigational reference points. The task faced by Honey and his team was formidable. Not only was the computer hardware of the day limited, the map database needed for his “augmented dead reckoning” system didn’t exist. The computer problem would turn out to be less of an issue than the maps.

An 8088-based machine running at 4.77 MHz with 128 kB of RAM was stuffed into a shoebox-sized case for the vehicle’s trunk. The team found that traditional raster CRTs didn’t have the resolution needed to show map details, so they built a vector-based display like the one used on Atari’s Asteroids video game.

Keeping track of the vehicle’s direction and distance traveled was the job of a pair of sensors. A fluxgate compass mounted on the vehicle’s rear window kept track of the heading relative to magnetic north, while a pair of Hall-effect sensors mounted on the non-driven wheels counted off the miles. The wheel sensors also provided additional vector data by tracking the difference in rotational speed between the wheels as the vehicle cornered.

Mass storage for the map data was another matter entirely. Hard drives of the day were expensive and bulky and singularly unsuited for life in the trunk of a car. CD-ROMs were years in the future, and floppy drives didn’t have the capacity needed to hold a usably large block of map data. After much experimentation, Honey’s team settled on a modified version of the standard audio cassette tape drive. A high-speed, random access drive using ruggedized cassettes was placed within reach of the driver, who would need to swap tapes when the limits of a particular map block were reached. Each cassette held 3.5 megabytes, which was pretty impressive at the time.

The Topology of Tapes

But exactly how to populate those tapes with maps was by far the thornier problem. Honey had an acquaintance, Marvin White, who worked on digital mapping at the United States Census Bureau, and persuaded him to join the team. Together with teams of digitizers, they turned public domain paper maps into digital models, using scanners and custom digitizing software. They started with the area around the company’s Silicon Valley headquarters and expanded to major metropolitan areas around the U.S.

Storing the data on the tapes turned out to be a major challenge, and solving it produced a breakthrough in storage technology. White’s knowledge of digital mapping and the mathematical field of topology led to a hierarchical approach to storing the map data, in which the data needed for the next turn or waypoint was most likely stored physically close to the current tape position. This made Etak’s geocoding algorithms much more efficient than anything that had come before, and it would be the key to the company’s long-term success after the GPS revolution.

Etak’s first product, the Navigator, came in two flavors. The Model 450 had a small 4.5″ vector display and retailed for $1,395, while the Model 700 sported a spacious 7″ screen at a $200 premium. Each map cassette went for $35. Given that six tapes were needed just for the Bay Area map, the Etak was not exactly consumer-friendly. Still, it had a number of deep-pocketed early adopters, including Michael Jackson and former child star Gary Coleman. In the end, though, the consumer market didn’t generate nearly the demand that the public service and commercial transportation markets did; those customers were attracted to the street-level detail provided by Etak’s geocoding system.

With something like 5,000 units sold by the late ’80s, the days were numbered for Etak’s Navigator. As the U.S. government continued to build out the NAVSTAR global positioning system, it was clear that what was once a closely held military secret would one day be opened up to public use, and Etak saw the writing on the wall. The company realized that its true value was in the digital map data, and in a move many companies would not have had the guts to make, it got out of the hardware business altogether and focused on its digital maps.

Certain readers will likely remember that early MapQuest maps bore a small Etak copyright, and with the eventual sale of the company to TomTom, its mapping technology lives on to this day. Etak Navigators may look quaint by today’s standards, but the technology Stan Honey developed unlocked digital mapping and set the stage for the GPS revolution that soon followed.






Free E-Book: Software Defined Radio for Engineers


We really like it when a vendor finds a great book on a topic — probably one they care about — and makes it available for free. Analog Devices does this regularly, and one you should probably have a look at is Software Defined Radio for Engineers. The book goes for $100 or so on Amazon, and while a digital copy has its pluses and minuses, it is hard to beat the $0 price.

The book, by [Travis F. Collins], [Robin Getz], [Di Pu], and [Alexander M. Wyglinski], covers a range of topics in 11 chapters. There’s also a website with more information, including video lectures and forthcoming projects that appear to use the Pluto SDR. We have a Pluto and have been meaning to write more about it, including the hack that makes it think it has a better RF chip inside. The hack may not result in the device meeting all of its specs, but it does work to increase the frequency range and bandwidth. However, the book isn’t tied to a specific piece of hardware.

Make no mistake, the book is a college-level textbook for engineers, so it isn’t going to go easy on the math. If the equation below bugs you, this might not be the book you start with:

[Di Pu] and [Alexander Wyglinski] have an older, similar book, and it looks like the lecture videos are based on that one (see the video below). The projects section on the website doesn’t appear to have any actual projects in it yet, although there are a couple of placeholders.

We have enjoyed Analog’s book selections in the past, including The Scientist and Engineer’s Guide to Digital Signal Processing, which is a classic. If you visit their library you’ll find lots of books along with classes and videos, too.

If you want something a bit less academic, there’s always [Ossmann’s] videos. Or if you’d rather just use an SDR, there are plenty of inexpensive options to choose from.




