Technical Difficulties from on Top of the Mountain
  Checking those spinning hard drives.
From Windows 7 command prompt:

wmic:root\cli>diskdrive get status

Well, ok then.

(Screenshot via How-To Geek.)
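For reference, the same WMI query can be run in one shot from a normal prompt, without dropping into wmic's interactive shell. This is Windows-only, and note that wmic has since been deprecated in favor of PowerShell's CIM cmdlets; the rough equivalent is shown too (a sketch, assuming the standard Win32_DiskDrive class, whose Status property reports "OK" for a healthy drive):

```shell
REM One-shot form of the interactive query above
wmic diskdrive get status,model

REM Rough PowerShell equivalent
powershell -Command "Get-CimInstance Win32_DiskDrive | Select-Object Model,Status"
```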

  The past is so strange, it's almost hard to imagine.
My co-workers were complaining about printf() being named poorly, because in these modern times, the function doesn't cause a "printer" to do anything. Sadly I immediately knew what the problem was--perspective, and not because I'm super good at studying history so I can learn from it. I'm just old enough that I remember when things changed.

I was there when you still dialed a phone, because the phone had a "dial" on it. I was there when, to send a letter to two people, you had to type the original onto two pieces of paper with a piece of carbon paper in between in order to make a "carbon copy". (Later shortened to just cc.) And I was there when a computer would "print" your program output onto paper as you ran it.

Computers are amazing, and the technology that's gone into them over the last 75 years has progressed at such a pace that, without being there, it's almost hard to believe how limited, difficult, and just different things used to be. Just look at what was considered state of the art in smartphones 10 years ago:

So wind things back a little further, to the beginning of the interactive computer experience, and you'll find a bunch of these:

This was a Teletype, a machine for encoding typed characters either onto paper tape (don't worry, they're long gone) or, with the dawn of time-sharing, for sending characters directly to a mainframe. The output would come back over the line and print on the giant roll of paper, and thus a key command in many languages (including BASIC, Perl, and so many others) was PRINT, or in the case of C, print with formatting ... printf().

The core of the teleprinter was a marvel of engineering, but maybe not in the way you'd expect: it was almost entirely mechanical:

In fact, the teleprinter wasn't even originally designed for computers. Instead, two of these devices were connected together between remote locations, and an operator would type a message on one while the same message was output on the second, like a telegraph but simpler to operate. But like many other technologies, teletypes were co-opted into computing, and shaped it greatly. Given their slow speed for both sending and receiving characters, Unix's creators chose to keep basic commands as short as possible, and we still remember a host of two-letter commands for navigating and operating on our filesystems (cd, ls, rm, df, mv, cp, you get the idea.)

So the next time you go to write a print statement, just remember: once upon a time, that's exactly what those first programmers were doing.


  It's the little things.
For some reason my new car only has two places you can open it with a key, despite having five doors.

Other than providing a struggling Hollywood writer with an interesting plot point for the 8,000th episode of NCIS, and saving Toyota $2 in costs on a $60,000 car, I can't really think of a good excuse for the passenger door not to have a place to open it with a key.




My three older kids are past this point, but with the twins now three, such phrases have begun to bounce around in my head again...

sweet and sour
about an hour


Oh well, that's parenthood I guess.

  The power in thinking about hard problems.
I am an engineer by trade, not a scientist, because I have a certain amount of impatience with thinking for thinking's sake.  But as I have spent great amounts of time solving simple problems, I have more and more appreciation for big thinking.  It can be a great place to go steal ideas to use tomorrow.

One big thinker was Feynman.  He made many contributions to physics, finding ways to solve problems that were intractable with traditional tools, but he was also curious about a great many things and was able to predict the future just by wondering what would happen if you took things to their extremes.  He predicted molecular machines (or MEMS) just by thinking about a series of 10:1 reduction levers that kept getting smaller and smaller.  But it turned out he had also been pondering the future of computing and its intersection with physics way back when.

The thread for this started with the quantum computing talk (LIQUi) from LangNext 2014, hosted on Channel 9.

There was a reference to a list of publications at the end of the talk, with a link: Search Results for Dave Wecker. Skimming through the paper 'Improving Quantum Algorithms for Quantum Chemistry', I noticed that the first reference was to Feynman's "Simulating physics with computers".  This was actually a keynote talk at a conference in 1981, but back then not everything was recorded and uploaded to YouTube, so you'll have to be satisfied with a scan of the original paper (or you can pay $40 to Springer Publishing for a copy of the transcript, for which they paid nothing.)

Basically, more than 30 years ago, Feynman did the thought experiment about whether you could simulate quantum physics--and to be complete he considered both classical discrete computers as well as then non-existent quantum computers.  Short answer?  Classical computers would never be able to tackle big enough problems in a scalable way, but if engineers ever figured out how to build usable quantum computers for the physicists to use... well, we might just be in for more interesting times.
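Feynman's scaling argument is easy to make concrete: a classical simulation that tracks a full quantum state needs one complex amplitude per basis state, and that count doubles with every qubit added. A quick sketch of the bookkeeping (the function name is my own, not from the talk or the paper):

```cpp
#include <cstdint>

// Bytes needed to store a full n-qubit state vector using
// double-precision complex amplitudes (16 bytes per amplitude).
// The 2^n term is the heart of Feynman's scaling argument:
// memory doubles with every qubit you add.
inline std::uint64_t state_vector_bytes(int n_qubits) {
    return (std::uint64_t{1} << n_qubits) * 16;
}
```

At 30 qubits that's already 16 GiB of amplitudes; at 50 qubits it's 16 PiB, which is exactly why "big enough problems" stay out of reach for classical machines.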

  Amazing progress.
About eight years ago, when I was travelling back and forth for work, I decided to splurge and buy a top-of-the-line memory card for my "smart" phone. This meant shelling out $100 for the biggest, most massive SD card you could buy -- a whole 2GB. That was pretty awesome. I could get all kinds of music, or several videos, on there (yes, the Palm Treo had the wonderful TCPMP media player, which would play standard-def AVI files scrunched down onto the 320x320 screen), and I was all set for plane rides and any other occasion of idleness.

If anyone doesn't think technology is racing along at breakneck speeds, they just have to check flash densities versus what week it is.

Still about $100, not counting the fake flash card from Foxx*

That's over a trillion bits, unless the flash vendors have decided to skimp and only give us 6.75 bits per byte, kind of like the hard drive vendors redefining K=1000 instead of K=1024. Let's hope they're honest, but in any case, that's still a mind-blowing number of bits.
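The arithmetic holds up either way the vendors choose to count, assuming a 128GB card; a quick self-checking sketch:

```cpp
#include <cstdint>

// 128 GB counted the marketing way (1 GB = 10^9 bytes) and the
// power-of-two way (1 GiB = 2^30 bytes). Both land over a trillion bits.
constexpr std::uint64_t kDecimalBits = 128ull * 1000 * 1000 * 1000 * 8;  // 1,024,000,000,000
constexpr std::uint64_t kBinaryBits  = 128ull * 1024 * 1024 * 1024 * 8;  // 1,099,511,627,776

static_assert(kDecimalBits > 1000000000000ull, "over a trillion, even counted the skimpy way");
static_assert(kBinaryBits  > 1000000000000ull, "over a trillion, counted honestly");
```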

Heck, I remember getting my hands on one of these back when I was first playing with a soldering iron:

That's an 8 by 8 array of BITS, that's right, 64 whole bits of memory. Awesome. Ok, that was a long time ago. Never mind, I'm going to start copying my entire library of kids' videos onto this trillion-bit speck, which will probably take all night.

* Yes, in this age of Kickstarters and cheap knockoffs, it's pretty easy to buy some second-hand flash chips and reprogram an SD controller to lie and say you have 128GB, but start failing after filling up the paltry 16GB or whatever size they back it up with. Figure half your customers will be too lazy to return it, and you have yourself a money-making machine. So, any listing on Amazon with no reviews, and a price too good to be true, is probably too good to be true.


  Staying inside the lines.
When it comes to dealing with binary data, especially binary packets from public interfaces, you can't take any chances. C was built a long time ago, when doing things safely was very expensive, and so its designers chose speed. For a very long time I built systems where speed was essential as well (in computer graphics, an extra instruction can be multiplied by a billion), but I eventually moved into more general computing problems, and at the same time computers got thousands of times faster.

In a modern system, random memory access is now the killer. The CPU has cycles to burn. Some of the performance lessons I've learned recently are the exact opposite of what was true twenty years ago.

To that end, a modern style of dealing with containers has to be bounded. You need to know where you are, and what your limits are. There's some question of what operator++() should do when you reach those limits, but one answer is for sure: it can't just step past them and start stomping on whatever's next. With that in mind, my current libraries have the following concepts for both buffers and containers:

A Range is a beginning and an end.

These never own the storage, they just indicate where it is. This can be used for both the available space to write to, and for the used space containing data within a buffer or container.
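As a minimal sketch of that idea (the names and layout here are my own illustration, not the actual library):

```cpp
#include <cstddef>

// A non-owning view: just two pointers bracketing someone else's storage.
// The same Range shape can describe either the used portion of a buffer
// or the free space still available for writing.
template <typename T>
struct Range {
    T* first;   // beginning of the span
    T* last;    // one past the end

    std::size_t size() const { return static_cast<std::size_t>(last - first); }
    bool empty() const { return first == last; }
    bool contains(const T* p) const { return p >= first && p < last; }
};
```

C++20's std::span later standardized essentially this shape, and likewise never owns its storage.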

A cursor is a range + an iterator.

This is where I've gone a bit past all the other work in C++ containers, but I think this is important. Modern iterators (at least the fun ones), are bidirectional or random access. That means the beginning is as important to keep track of as the end. And copying a cursor should not narrow you to the space you had left, but should allow the copy to head backwards to the front just as easily as progressing to the end.

This also gives us a great data structure for those algorithms like std::rotate() that operate on three iterators.
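A sketch of the cursor idea, again with hypothetical names of my own: a copy carries the whole range along with the current position, so it can walk back to the front as easily as it progresses to the end, and stepping clamps at the bounds instead of stomping past them.

```cpp
// A cursor is a range plus an iterator: the bounds travel with the position.
template <typename T>
struct Cursor {
    T* first;  // start of the underlying range
    T* last;   // one past the end
    T* pos;    // current position, always kept within [first, last]

    // Bounded stepping: refuses to leave the range in either direction.
    Cursor& operator++() { if (pos != last)  ++pos; return *this; }
    Cursor& operator--() { if (pos != first) --pos; return *this; }

    T& operator*() const { return *pos; }
    bool at_end() const { return pos == last; }
};
```

For an algorithm like std::rotate(first, middle, last), a single cursor naturally supplies all three positions: first, pos, and last.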

There have been a lot of people banging around on this for some time. Andrei Alexandrescu wrote a great paper, On Iteration, that had lots to say about his implementation of containers and iterators for D.


Life in the middle of nowhere, remote programming to try and support it, startups, children, and some tinkering when I get a chance.

