and the lab is up

Posted in Geekfest on August 13th, 2016 by juan

Ok, so I got it working on Sunday, but I’ve had a long week and didn’t have a chance to update. The C6100 is up and running. I’ve moved it into the DCF and, surprisingly, it makes so little noise that I can’t hear it through the door. That was one of my biggest concerns. Long story short, the process was involved, but mostly because the SSL certs on the management IPs are so outdated.

So all that said, this is what I ended up with: four nodes, 32 cores, 192GB of RAM, and a pile of storage.

I’ve never had that much compute, memory, or storage. That was a very, very large data center not long ago.

Cool. Now on to other cool stuff.

Oh… and a couple of things to note in case someone is actually reading this.

The back of the C6100 gets **hot**. I noticed in the move from my office into the DCF that the USB thumb drives I’m using for boot were very hot. I’m concerned that the heat is going to take them beyond their supported limits, so I bought a set of little pigtail extenders to get them off the motherboard.

Also – this thing runs at relatively low power, but it still draws about 660 watts while mostly idle. I’m going to have to buy another power supply and another UPS to make sure that I survive more than 5 minutes of outage.
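
(The UPS napkin math is just usable battery watt-hours divided by load watts. The 660 watts is from my meter; the battery capacity below is a made-up example, so plug in the real number from the spec sheet:)

# runtime in minutes ≈ usable watt-hours / load watts * 60
# example: a hypothetical UPS with ~600Wh usable, at my 660W idle draw (≈54 min)
[juan:~]$ echo "600 / 660 * 60" | bc -l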


update on the DCF upgrade

Posted in Geekfest on August 7th, 2016 by juan

Got the C6100 on Friday (8/5). The system is as described by the seller on eBay, but … FFS … it only included one power supply! I tried getting the thing going and ran into immediate snags. First, each node has only two USB ports. My intent is to boot this guy off of USB, but I need a keyboard, a USB key to do the install, and a USB drive to use for boot. One too many. So I tried using a hub to attach the key and the keyboard. It wouldn’t boot. I moved the key to a dedicated port and put the USB “drive” on the hub, with the intent to move the “drive” to a dedicated port once the install was complete.
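
(Building the install key itself is the easy part – a rough sketch below; the ISO name and disk device are placeholders, so check the device list before aiming dd at anything:)

# write the installer image to the USB key (names are examples!)
[juan:~]$ diskutil list
[juan:~]$ sudo diskutil unmountDisk /dev/disk2
[juan:~]$ sudo dd if=installer.iso of=/dev/rdisk2 bs=1m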

No Go.

So, the next step was to try to use the remote management to mount a virtual CD. I tried connecting with a current release of Chrome from my production iMac. Well, wouldn’t you know it – you need Java to do that. I don’t want that on my main iMac, but I was going to install it for the sake of the project. Before I did, though, I wanted to see if Firefox would handle it better. Firefox won’t connect to the port because of a security warning on the old HTTPS certs on the box. Same thing with Safari. That made me fire up a Windows 8.1 VM. Same issues. And, oh, during all of that my internet connection went out … for 3 hours. AHHHH. Undeterred, I found an old XP VM. That didn’t have Java loaded either.
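
(If you’re curious what the browsers are choking on, openssl will happily talk to the management port and dump the cert without any of the browser drama – a quick sketch, with a placeholder IP:)

# show the management controller's cert subject and validity dates (IP is an example)
[juan:~]$ openssl s_client -connect 192.168.1.120:443 -showcerts </dev/null | openssl x509 -noout -subject -dates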

End of Line.

Back to the project now (Sunday 8/7/16).

First things first, though: I have to finish the pfSense build and rollout. Cross your fingers. Lots of work.

Sophos! Please give me 100 licenses for internal use. I promise it’s not commercial work! Also note that I’ve single-handedly driven multiple deals for you, because I brag about your stuff to just about every customer I visit. And I visit lots of customers. Lots and lots.

Another fresh start

Posted in Geekfest on August 7th, 2016 by juan

So, as usual, I’m sitting on a plane with little to do. Figured this is a good time to start writing for my blog again. It’s been much ignored recently, but a new update to Ulysses just came out, and it supports posting directly to WordPress sites.

There’s been much that has changed since I last posted. Much has changed in my personal life, but much has also changed in my nerdery. My home network now has over 45(!) things with IP addresses on them. This is forcing me to make a change that I did not want to make. For years I’ve been running the Home Edition of Sophos UTM. I really couldn’t be more pleased with its functionality, but for whatever reason, it is limited to only 50 internal IPs. Granted, when I first got the software, that seemed like a ridiculous number. But, as with all such things (640K of RAM – who’ll ever use that much?), the time has come for me to move on. My first attempt was to use the next-generation firewall from Sophos. That failed 30 minutes into me trying to use it. Many of my devices have static IP addresses handed out to them via the DHCP server. When I was typing those addresses into the new firewall, it TIMED OUT ON ME AFTER I PUT THEM ALL IN. Yup, 30 minutes of laboriously entering MAC addresses, IP addresses, and host names – and the damned thing failed. You see, Sophos didn’t develop a migration tool from the old UTM to the new “goodness”. Time to move on.

So – this weekend, I’m going to pfSense. It’s not as slick. It doesn’t do all the UTM stuff. It’s clearly written by folks who are nerds like me and not professional UI dudes (no disrespect intended). I’m going to miss some of the Sophos features, but I have to move on.
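
(One thing I refuse to do is hand-type all those static DHCP mappings again. My rough plan: dump them to a CSV and generate the pfSense config stanzas from that. This is a sketch under assumptions – the CSV is something I’d export by hand, and the XML field names should be checked against a mapping created in the pfSense UI before trusting the output:)

# leases.csv holds one "mac,ip,hostname" per line (hypothetical export)
# emit <staticmap> stanzas to paste into the dhcpd section of config.xml
[juan:~]$ awk -F, '{printf "<staticmap><mac>%s</mac><ipaddr>%s</ipaddr><hostname>%s</hostname></staticmap>\n", $1, $2, $3}' leases.csv > staticmaps.xml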

Why, you ask? Well – that’s the good part of this story. I just bought a new (well, new to me) Dell C6100 four-node blade system for my home lab. It’s going to bring 32 cores, 192GB of RAM, and lots of other things to the home DCF (Data Control Facility, for you new readers). That gives me enough juice to run most of the “hard core” stuff my vendors are trying to shove into Datalink. It’ll be fun having a really nice home cluster again.

But … to get that guy on my network, I’m going to have to dole out at least 12 more IPs just for the hardware. Imagine how many more I’m going to have to hand out once I start firing that guy up full of VMs.

Yeah – I’m a nerd and have first world problems – but that’s how I learn and make a living.

More to come…


Old computers were fast (back then)

Posted in Geekfest, Musings on April 9th, 2012 by juan

## Speed

I have many vices. One of them is collecting old computers. To me, those are the 8-bit systems that were popular in the late 70’s and early 80’s. This last week, I was able to get a nice collection of Atari 800 and 400 machines from a local Craigslist listing. Those are fun machines and bring me back to learning programming for the first time. My first computer was a TRS-80 Model 1, Level 1. However, the first machine I had access to that had “real graphics” was an Atari 800 in the computer lab at my middle school. I loved playing with the graphics and remember learning all sorts of tricks to make them faster.

I got to playing with the Ataris and typed in some BASIC programs just to see the thing do its thing. I remember them being fast back in the day. Well, here I am 30-some-odd years later with a computer that would seem like something out of a far-future world to my little self back then (4 processors! 4GB of RAM! 256GB of solid state disk! Wireless networking to the world at 50Mbit! A megapixel display with 32 bits of color depth per pixel! On my lap! And it weighs less than 2 1/2 lbs! Seriously? That can’t possibly be! Oh – and that’s just my laptop. Don’t forget I have a “real” computer too.) Those Ataris were not fast.

Being the geek that I am, I had to see how much faster we have it today. I poked around the net for a bit and found an implementation of the Sieve of Eratosthenes on this site. I entered it into my Atari, and it came in at just around 5 1/2 minutes. I had to test it against my current computers, so I downloaded a copy of Chipmunk Basic. It seemed like a fair test to compare one interpreted BASIC to another. Here’s the BASIC version I wrote up:

10 dim flag(8191)
15 for a = 1 to 1000
20 count = 0
30 for i = 1 to 8191
40 flag(i)=1
50 next i
60 for i = 0 to 8190
70 if flag(i+1)=0 then goto 150
80 prime = i+i+3
90 k=i+prime
100 if k > 8190 then goto 140
110 flag(k+1)=0
120 k=k+prime
130 goto 100
140 count = count + 1
150 next i
160 rem print count
170 next a
180 print a;"iterations"

The big difference between my version and the ATARI version is that mine runs the sieve 1000 times, so I get meaningful timings. (The loop variable reads 1001 after the final pass, which is why the output below says 1001 iterations.) The results?

[juan:~]$ time basic t.b
1001 iterations
basic t.b  7.63s user 0.00s system 99% cpu 7.639 total

That works out to my laptop being about 43,000 times faster than that ATARI. On one core. Let’s see what it’s like on all cores:

[juan:~]$ for i in {1..4}
for> do
for> time basic t.b &
for> done
[2] 30012
[3] 30013
[4] 30015
[5] 30017
[juan:~]$ 1001 iterations
basic t.b  18.80s user 0.03s system 98% cpu 19.029 total
[2]    done       time basic t.b
[juan:~]$ 1001 iterations
basic t.b  18.76s user 0.03s system 98% cpu 19.069 total
[3]    done       time basic t.b
[juan:~]$ 1001 iterations
basic t.b  18.80s user 0.02s system 98% cpu 19.069 total
[5]  + done       time basic t.b
[juan:~]$ 1001 iterations
basic t.b  18.79s user 0.03s system 98% cpu 19.097 total
[4]  + done       time basic t.b

Or roughly 70,000 times faster.
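
(Checking my math, taking the Atari’s 5 1/2 minutes as ~330 seconds per pass: one core does 1000 passes in 7.63 seconds, and the four parallel copies together do 4000 passes in about 19 seconds.)

# one core: ~43,000x
[juan:~]$ echo "330 / (7.63 / 1000)" | bc -l
# all four cores: ~69,000x, which I'm rounding to 70,000
[juan:~]$ echo "330 / (19.03 / 4000)" | bc -l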

But wait. The site that had the listing for the BASIC version also had one in Action! (which was a compiled language for ATARIs). That version ran in about 1.5 seconds according to the Analog article (I don’t have the Action! package to verify). Well, I couldn’t not measure that too. So I wrote a C version of the Sieve. It’s a very dumb version, intended to match the BASIC one as closely as possible:


#include <stdio.h>

/* one pass of the sieve, written to match the BASIC version's logic */
int sieve()
{
  int flag[8192];
  int i,count,k,prime;

  /* mark everything as a candidate prime */
  for(i=0;i<8192;i++) {
    flag[i]=1;
  }

  count=0;

  for(i=0;i<8190;i++) {
    if (flag[i]) {
      prime=i+i+3;
      /* knock out all multiples of this prime */
      k=i+prime;
      while(k<=8190) {
        flag[k]=0;
        k+=prime;
      }
      count++;
    }
  }

  return count;
}

int main()
{
  int c,i;

  /* repeat the sieve so the runtime is measurable;
     note the <= makes this 100,001 passes, and the printf reports i-1 */
  for (i=0;i<=100000;i++)
    c=sieve();

  printf("found %d, %d times\n",c,i-1);
}

It turns out that this compiled version is so fast that I had to run it 100,000 times to get measurable results:

[juan:~]$ gcc -O4 t.c -o t
[juan:~]$ time ./t
found 1899, 100000 times
./t  4.17s user 0.00s system 99% cpu 4.177 total

And to do it on all the cores:

[juan:~]$ for i in {1..4}
do
time ./t &
done
[2] 30449
[3] 30451
[4] 30452
[5] 30454
[juan:~]$ found 1899, 100000 times
./t  7.44s user 0.01s system 95% cpu 7.832 total
[5]  + exit 25    time ./t
[juan:~]$ found 1899, 100000 times
./t  7.51s user 0.01s system 95% cpu 7.850 total
[3]  - exit 25    time ./t
[juan:~]$ found 1899, 100000 times
./t  7.46s user 0.01s system 94% cpu 7.891 total
[2]  - exit 25    time ./t
[juan:~]$ found 1899, 100000 times
./t  7.49s user 0.01s system 94% cpu 7.897 total
[4]  + exit 25    time ./t

Or roughly 80,000 times faster.
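
(Same napkin math as before, using the Action! version’s reported 1.5 seconds per pass as the baseline: the four parallel runs complete 400,000 passes in roughly 7.85 seconds.)

# ~76,000x, which I'm calling roughly 80,000
[juan:~]$ echo "1.5 / (7.85 / 400000)" | bc -l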

All that on my laptop while I’m sitting in bed. Running on batteries.

The future is cool.

On Lion and recovery

Posted in Fanboy, Geekfest on September 20th, 2011 by juan

I’m 30K-plus feet up in the air right now, doing some work on my MBA. I can’t tell you enough how cool it is to be able to work while I’m sitting on a plane. With a real computer.

So – I’m typing away at a blog post for work. As usual, I have my retinue of open apps doing their thing: Chrome (I’ll talk about this vs. Safari some other time), iTerm2 (you know about that, right?), Mail, and Preview. I’m switching between Preview and MarsEdit to write the blog post, and all of a sudden my keyboard doesn’t respond. Even worse, I can’t command-tab to switch to another app to see if I can fix it. Switching over to iTerm2 doesn’t let me type on the command line (which has always been my failsafe way of fixing things). I even try the shut-the-lid, sleep, and wake trick. That usually fixes keyboardy things. Nothing. I start panicking a little. I’ve been working on the work blog for a while and have much mental sausage already spent on it. In desperation, I close the lid, open it, and try to switch to another user (you do have another user, just in case, right?). That doesn’t work either. I can’t type in the login window!

The only thing that works is the shutdown button on the wake-from-sleep screen. I hit it and I’m asked for a login again. OH NO! But luckily, I can type in my admin user and password, at which point my MBA reboots. Because of the cool SSD thing, the reboot is very quick. With trepidation, I log in to my regular user and …

and …

It’s all back! All of it. My apps are where they’re normally hidden (Spaces), and MarsEdit has my blog post open and current to the very last character I typed. Chrome recognized that I killed it, and all my open tabs are brought back. Mail is happily doing its thing. iTerm2 is there waiting, flashing the cursor, begging me to vi or something.

I’ve never had a full panic shutdown restart experience not cost me any work. Never. Lion fixed that. Somehow. Magic.

Now the question – what happened? Why did I lock up? I have a suspect. The only thing that didn’t come back is Preview. It complained that the file I was looking at was not available because I didn’t have permissions to it. Hmmm… That file was in my Dropbox, and Dropbox was just recently upgraded to one of the beta releases. I’m pointing the finger at that.

so I remember

Posted in Commentary, Geekfest, Musings on March 10th, 2011 by juan

One of my clear recollections of my early computer usage was the day that I bought my first hard drive. At 5 MEGA BYTES it seemed a luxury beyond all imagining. It only cost me $3,000.00. In 1980.

Had the same feeling in the mid 80’s when I upgraded my Amiga to 2MB of RAM (remember the sidekick?) and a 40 MB hard drive. It seemed like RAM beyond measure. Storage beyond possible utilization.

In the early 90’s, work gave me a SPARC-based UNIX workstation: a super-high-res screen, 4 MB of RAM, 1 GB of hard disk, and INTERNETS. Mere PC’s were useless to me. Imagine the _power_ of my configuration.

In the early 2000’s (naughts?), my laptop came with dozens and dozens of GBs of hard disk space and a gigabyte of RAM. It ran Windows, but that was before OSX became stable.

By the mid 2000’s, my laptop had a 17″ super-high-res screen, 120 GBs of hard disk, 1.5 GB of RAM, and UNIXes. Welcome to the vortex of Steve. The power was mind-boggling.

By the late 2000’s, my laptop still had a 17″ screen, but hi-res to a new level, 8GB of RAM, and a 500GB HD. The processor had two cores, each of which is nothing less than a supercomputer.

And now I have the first desktop I’ve used consistently since the early-90’s UNIX workstations. It has a 27″ screen, 8 cores of super-duper computing horsepower, more RAM than I have managed to use yet (no swap), and it’s connected to 20+ TB of storage on my home gigabit network. My DCF has officially exceeded a [LOC](http://libraryofcongress.gov).

My current laptop has 128GB of storage, 4GB of RAM, and two cores.

Say what?

What just happened? When did it become a feature for less to be more?

Simple: we have too much juice. All around. What we __can get__ and what we __use__ are worlds apart now.

Interesting.


faster, must do faster

Posted in Commentary, Geekfest on March 6th, 2011 by juan

## Act without doing

One of the big pleasures I’ve found since switching to the Mac is [Quicksilver](http://blacktree.com). For years it was _the_ way for me to launch, open, or do anything. If you’ve never used it, it’s definitely worth a look. Unfortunately, the developer, Alcor, has moved on to a lucrative job at Google and left its future to the tubes. When Leopard came out, QS broke (supposedly it’s fixed, but I’m fickle). That sucked for me. I’ve been looking for a replacement since. There’s a ton of products that kind of do the same thing: [launchbar](http://www.obdev.at), Google’s [quicksearchbox](http://google.com/quicksearchbox), and [butler](http://manytricks.com/butler/). But – I’ve just started using [alfred](http://www.alfredapp.com/). I like it because it’s FAST, it’s lightweight (for me, with full indexing of everything, it only consumes 12.5MB of real RAM), and it has a clipboard manager (when you buy the Powerpack).

The developer is very active on Twitter and on their support forums. It was $15 (on special at [AppSumo](http://appsumo.com)), and it’s well worth the bucks.


Post PC world

Posted in Commentary, Geekfest, Musings on March 6th, 2011 by juan

## It’s about who uses it

So I’ve been ranting about how, for me, the iPad is not the device for content creation. After further reflection, that needs to be revised. I should change my tone, because it could be for others. At the iPad 2 introduction, The Steve made a point of mentioning that this is the intersection of technology and liberal arts. That’s it. That’s who can use the Pad for _creation_. My world is emphatically not liberal arts. My passions all revolve around technology. My work is _all_ technology. Interestingly, my content creation, although putatively creative, is all technology driven. The closest I get to liberal arts is … media consumption. Aha.

Now, the truly creative folks – the artists/authors/painters – are typically not technology driven. They want something that captures their creative expression in an intuitive way. They couldn’t care less about the megaseekels and geegasquirtz. They care that it turns on, they point, and it does. iPad.

I get it.

But not for me.


Was Sun right?

Posted in Commentary, Geekfest on March 2nd, 2011 by juan

## My move to the cloud
It turns out that my move to the cloud might be what everyone else is going to be doing. Over the last week or so, I’ve changed the way I look at my computing devices. For a very long time (well, just about forever), I’ve really only had one computer that I used as “the computer.” That’s despite having a ton of hardware lying around doing things in my basement and on my desk. Those “other” computers were utility devices: my VMware farm for email/http/etc. services, my Mac mini for desktop/utility services, my NetApp/OpenSolaris boxes for file storage. My laptop was still the primary holder of what I considered critical functionality and data. If I lost or broke my laptop, I’d be in a world of hurt. Well, not really – backups are a good thing. I’d be in a world of “recover, waste time waiting, and then do work.”

With my acquisition of a truly powerful desktop (iMac 27″ Core i7 – woot!), I needed to make a change. I’d prefer to work on my desktop when I can, and then go mobile when necessary – and do it seamlessly. The email part was easy, or should have been easy; I’ll post on that later. What was not easy was the data. In retrospect, it should have been easy, but I made it hard for myself. In my ultimate fantasy world, I would have a complete copy of all data, application state, and application configuration transferred from one machine to another. That way I could literally get up from my desk, move to the couch with the laptop, and just continue. Sure – I could have done that with remote desktop of some sort, but that’s not really an option when I’m on a plane or in a hotel with crappy internet. In the ideal world, I would be sacrificing only compute performance and screen real estate for mobility. To get there, I played with a ton of sync options, both commercial and open source: rsync, GoodSync, ChronoSync, etc. Unfortunately, none of them really captures the state of applications, and in the case of ChronoSync, your computers have to be physically close (as in on the same network) to effectively keep them in sync.
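
(For the curious, the rsync flavor of this looked roughly like the line below – a sketch, with placeholder host and paths standing in for my actual machines:)

# mirror the laptop's documents onto the iMac (names are examples)
[juan:~]$ rsync -avz --delete ~/Documents/ juan@imac.local:Documents/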

My path to the cloud became clearer with the acquisition of the MBA 11″. Even though it’s a top-of-the-line 128GB SSD model, it simply does not have the capacity to hold all the data I kept on my previous core machine, my 17″ MBP. That meant sacrifice. Out of sacrifice came clarity. Before this, I had not fully committed to the iMac being “the computer.” That’s because I wanted full access to everything while on the road. Well, the 11″ is going to be the on-the-road machine. I can’t have full access on it. The decision was simple: the iMac became the ’puter. All of my iTunes and iPhoto stuff left the MBP and moved over to the iMac. With that move, I lose the ability to sync my i-devices on the road, but that’s OK. I’ve never been fanatical about that anyway.

All that was left was the problem of having my core, important data available to me at all times on all computers. Enter Dropbox. I purchased a paid account on the Dropbox service and synced all of my core data to the cloud. My workflow had to change a little bit based on where I place my stuff. Instead of ~/Documents/xxx, I now place it in ~/Dropbox/xxx. The service then automatically syncs all data to the cloud and back to my devices – even my iDevices if needed.
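
(The move itself is a one-liner per folder; leaving a symlink behind keeps anything that still looks in ~/Documents happy. The folder name below is just an example:)

# relocate a folder into Dropbox and symlink the old location to it
[juan:~]$ mv ~/Documents/Projects ~/Dropbox/Projects
[juan:~]$ ln -s ~/Dropbox/Projects ~/Documents/Projects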

### the network is not the computer
Sun’s vision was to make all services cloud-based – including compute. The only thing you would need is an access terminal; your data and applications would be there. The access device really needed only enough horsepower to run authentication, the network, and the display. VMware’s View and the rest of the VDI gang are headed down this same path. For much of the enterprise’s needs (think call centers and things like that), this is __the right way__. But for me, that means I have to be on the network. I’m not always on a reliable network, or even a fast one. I have to have local compute and storage to do what I need to do. In all honesty, I think a very large segment of the non-home, non-drone corporate user base is in the same boat. The problem has been exactly the path I went through: how to make the data and the compute always available.

### where I ended up
After much gnashing of teeth and angst, I ended up here:

– The iMac is my central compute platform and also acts as the master sync for all data, including the iDevices
– Core data is hosted on Dropbox and automatically synced to all my devices, mobile or not (great value for $100/year)
– My MacBooks have essentially become interchangeable: the Air for when I need light weight and simplicity (most of the time), the MBP for when I’m traveling and need a desktop replacement (large screen, compute horsepower, etc.)
– The iDevices (iPhone, iPad) have become more useful because I can use the data from Dropbox to do quick work on recent data

To accomplish this, I had to make one major workflow change: close all apps at the end of the day. Because OSX is so reliable about sleep mode, I had gotten into the habit of just closing my laptop and moving on. Many times I don’t even save my work. Really. It is that good. Well, the sync thing requires that I not do that. It’ll take a little while to break a 7-year-old habit, but it can be done.

Anyway, if you think about where I’ve gotten to, it is this: my compute devices are interchangeable, and I can select which one I use based on practical location requirements (am I sitting at my desk? on a plane? at a customer’s?). As a side effect, my data is now also safe. It’s in the cloud and on multiple devices. Losing any device to theft, negligence, failure, etc. means little other than replacing the device. The important stuff – my work and data – is simply re-instated. Pretty damn cool.


My public and private cloud experience

Posted in Commentary, Geekfest, OOTT on February 27th, 2011 by juan

## So – I lost some images…
Today I had to do some errands, one of which included going to a car wash. It was more than just a simple wash – it was the first wash of my wife’s 1.5-year-old car. Yeah – that’s not good, but it is what it is. Naturally, that wash took some time (and money). During the wait, I fired up my 11.6″ MacBook Air (weee!!!). I looked and saw that there was no open wifi around. But not to fear – I whipped out my trusty Clear hotspot. In a matter of a few minutes, I was settled in and doing today’s stuff.

Today’s stuff happened to be an update to my daughter’s lacrosse website ([rhsgirlslax](http://rhsgirlslax.com/ “RHS Lax”)). I had to put some of the sponsor images on the website via a pretty cool WordPress plugin called [Ad Squares Widget](http://www.primothemes.com/ “Ad Squares”). It felt good to be happily resizing images, redoing some of the logos so they would fit, etc., on a really small form-factor computer – definitely something I could not have done (well, easily) on an iPad. Anywho, about halfway through the mini-project, I noticed that the plugin had been updated. WordPress does a nice job of notifying you about this. So, not thinking twice about it, I told WordPress to go ahead and update the plugin. While it was doing that, I finished fixing the code for the plugin and then added all the URLs for the ads into the widget. But when I went to reload the site, none of my images worked. A serious WTF moment later, some digging showed me what should have been pretty evident all along. The example code from the plugin places the images in the folder with the plugin, so that’s what I did. Well – when you update a plugin, it doesn’t update the directories and files – it replaces them! BAM! All my images were gone.
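
(Lesson learned: keep your own assets out of the plugin’s directory. Something like the below – the plugin folder name is my guess, and the paths assume a stock WordPress layout:)

# move the images to wp-content/uploads, which plugin updates never touch
[juan:~]$ mv wp-content/plugins/ad-squares-widget/images wp-content/uploads/sponsor-images

Then point the widget’s image URLs at the uploads path instead.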

No big deal, right? Just upload them to the server again and poof. Well, that wasn’t so easy. The 11″ MBA had just come into duty, and all I had on it were the images I was working on that day (I got them from my email). The good news is that the iMac that had those images was at home. A quick ssh to my home server and a hop over to the iMac, and I was there. Then I ran into a simple problem: how do I efficiently transfer those files back to my 11″? That’s when Dropbox pulled a double-whammy AHA! on me. A quick “mv rhsimages ~/Dropbox” later, they were in the cloud – and seconds later, on the 11″. Wee!!

It gets better. I serve the website from my home server via my DSL line, which has a relatively meager 750Kb/s uplink. It works well for most things, but it sure isn’t enough to quickly serve something with tons of images. Well, Dropbox has this Public folder thing. If you want, you can generate a URL to any file in that folder. So rather than copying those images back to my server, I left them on Dropbox and grabbed the public URLs. I used those in the Ad Squares page, and now my daughter’s site is being served a zillion times faster.

So – what is this, then? I like to think of it as my cloud migration experiment. I’m blending private (my VMware farm with the web server) and public (Dropbox) clouds to do something better. How about that.
