Favorite OSX Applications

Here are a few of my favorite Mac applications. These are mostly generic applications useful to most people using a Mac. I do of course have plenty of task-specific and domain-specific applications I use, but those are a topic for a later post.

Clipboard (cut-and-paste) history

Jumpcut

There are many applications that do this, or do this as part of doing all kinds of other application-launching tasks, but this simple utility is the one I’ve used forever. I cannot imagine how people use Macs (or any computer) without a clipboard history.

Online backup

Arq

This is a straightforward and reliable backup tool that lets you use backends like Amazon and Google for data storage.
Sure, I have a nice little Time Capsule for local backups, but repeat after me: “It’s not backed up until it’s offsite.” And using something like Amazon’s Glacier storage means it can be economical to back up large data sets.
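To put a rough number on “economical”: here’s a back-of-the-envelope cost sketch, assuming the roughly $0.01 per GB-month rate Glacier advertised around this time (retrieval and request fees are ignored, and the rate is an assumption, not a quote):

```python
# Rough monthly cost of offsite storage on Amazon Glacier.
# The $0.01/GB-month rate is an assumption based on Glacier's
# advertised pricing circa 2014; retrieval fees are not included.
def glacier_monthly_cost(gigabytes, rate_per_gb=0.01):
    return gigabytes * rate_per_gb

# Backing up a 500 GB photo library:
print(glacier_monthly_cost(500))  # 5.0 (dollars per month)
```

Even a fairly large archive ends up costing just a few dollars a month, as long as you rarely need to pull the data back down.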

Retina Mac Display Resolution Setting

Eye-Friendly

Yes, the Settings app lets you set display resolutions, but with nowhere near the level of control that this utility provides.

Disk Usage

DaisyDisk

If you want to see what is using up space on your disk, this tool is a beautiful way to do it!

Disk Cloning

Carbon Copy Cloner

So there are really three pillars that make up a comprehensive backup strategy.

  1. Local and very frequent – Time Capsule
  2. Offsite – Arq
  3. Local and immediate restoration – Carbon Copy Cloner

Carbon Copy Cloner is the tool to use for making a bootable backup, or when migrating to a new hard drive.
If you are in a situation where waiting to restore from a Time Capsule, or offline, would result in excessive downtime, a CCC backup is the fastest way to get things working again.

Monitor Everything

iStat Menus

When you want to go many many steps beyond Apple’s Activity Monitor, iStat Menus is there.
Incredible detail and monitoring of your network connection, voltages, temperatures, memory, CPU and more.
Maybe it’s just my love of gauges and dials, but I really like knowing what’s going on inside the shiny box.

Other Tools

In no particular order, here are a few other tools that I find useful, but not everyone will:

  • Simplenote by Automattic
  • The Unarchiver by Dag Agren
  • Transmit by Panic
  • VLC

Embedded Development – The Hobby Edition

Ok, so here are pictures of embedded controllers and such. Just a snapshot of my hobby embedded environment.

Not shown is my workbench, which gets set up and taken down on the kitchen table as needed. (apartment living!)
This includes soldering station, digital oscilloscope, and tools.

The table top near the desk:

Spark Core, RaspberryPi, Arduino and so forth. Plus an old-time iPod!

The RaspberryPi runs the home automation system, and is connected to an XBee used for communication with various peripherals. The Arduino is one of those, in development. The Spark Core is my development core, with others already in use around the house.

IMG 6078

Drawer #1:

Arduino stuff, XBee modules, and various shields and parts from Adafruit.

IMG 6079

Drawer #2:

RaspberryPi and Spark Core. Power supplies, and a fairly ancient Radio Shack multimeter.

IMG 6080

E-Books Update for Early 2014

Just wanted to give an update on my slow but steady transition to E-Books – a process that took on new urgency when we went from suburban life to apartment renters in downtown San Diego.

As someone who has spent a lifetime collecting, reading, and cherishing physical books, this has been a challenging journey. As the technology gets better, it has become easier though.

With my latest acquisition of the new Nook Glowlight (the one released at the end of 2013, not to be confused with the Nook SimpleTouch Glowlight which was its predecessor) this transition has become even easier to accept.

In the beginning, E-Books were read either on a Mac or PC, and the mobile hardware was pretty clunky. Today we have Retina iPads and very nice E-Ink readers. My latest Nook is E-Ink, and although my original Nook was also, this one is miles ahead.

It’s actually getting to the point where in many ways I’m preferring reading on my Nook versus a physical hardcover. Sure, this was always the case when portability was the main factor. “Hey, I can take ten novels with me on the plane, and it takes up less space and weight than one hardcover!”

But now I’m finding that the technology has improved to the point where even when the weight and size isn’t an issue, the experience is as good or better.

What factors are bringing about this change? Here they are, in no particular order.

  • Incredible reductions in size and weight. My new Nook weighs about the same as a moderate paperback, and about half of what a really thick paperback weighs.
  • Higher resolution screen. It’s getting pretty close (~210dpi) to printed resolution. Close enough that it’s not obvious that it’s an electronic page rather than a paper one.
  • Frontlight for reading in the dark. The new Glowlight is pretty good, with only some mild darkening at the very top of the page. Fantastic for low-light situations. And since it’s front-lit rather than back-lit like an iPad, it should have less of the sleep-impacting effects that have been reported for LCD displays.
  • No page-flip “flash” that previous generations of E-Ink typically had.
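The ~210dpi figure above checks out from the panel’s pixel dimensions. Assuming the commonly cited specs for the 2013 Nook Glowlight (1024 × 758 pixels on a 6-inch diagonal), the math works out like this:

```python
import math

# Pixel density from resolution and diagonal screen size.
# Panel specs (1024 x 758 at 6") are the commonly cited figures
# for the 2013 Nook Glowlight, assumed here rather than measured.
def dpi(width_px, height_px, diagonal_in):
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

print(round(dpi(1024, 758, 6.0)))  # 212
```

That lands right at the ~210dpi ballpark, versus roughly 130dpi for earlier 800 × 600 six-inch readers.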

And then there are the other factors that have been present for a while: The ability to carry hundreds of books, weeks of battery life, and (if reading a book purchased from Barnes & Noble) the ability to read those books on my iPhone or iPad with my place in the book being synced between the devices. Oh, and not having to try and hold a book open when eating lunch etc. is pretty nice as well.

I’m not ready to abandon my physical books just yet – nothing is going to replace that experience. But for my general reading, I think I’m about at the point where I no longer see switching to E-Books as a necessary compromise to accommodate our new less-burdened lifestyle, but as a pretty nice way to enjoy reading.

Nest acquisition by Google

On Facebook, a friend asked about my take on the Nest acquisition by Google that was recently announced.

(I’ve given Nest thermostats and Smoke Alarms as gifts, and have a Smoke Alarm myself. I’d have a thermostat if we weren’t apartment dwellers at this time.)

Here’s the answer I gave:

Unhappy. While I’m aware of all the positives (e.g. deep pockets to allow expansion of the Nest product line and protection against some of the patent issues they are threatened with), I really don’t trust Google anymore.

Nest has given assurances that the customer data won’t be shared, but that policy is subject to change down the road. Google made similar assurances regarding YouTube and those have weakened over time.

I’m trying not to be too paranoid, and I’m not suggesting that anyone throw away their Nest devices just yet, but I’m definitely concerned.

Really wish it had been Apple. Sure, they want all my money but they get it in an upfront fashion, not by advertising or selling my data.

Oh, I should add that on the bright side, this may kick off the overarching privacy discussions that need to take place as we head further down the “Internet of Things” road.

In an ideal world, the result of the acquisition will be Google realizing that they need to make some serious promises/safeguards regarding privacy for Nest if they expect it to remain a viable product line.

In the non-ideal world they just go ahead and be evil, counting on the sales to people who don’t know or don’t care about the privacy concerns. Sigh.

iPad mini – Retina musings

When the iPad mini was released, I bought one to see just how that size would work. I really liked it, to the point where I regretted getting just the 16G wifi-only model. I also missed the Retina display of my full-sized iPad. My plan was to grab a Retina mini when it was released today, but since Apple made the full-sized iPad so much lighter and a bit smaller (but same screen size), my decision became a lot harder.

And then, to further complicate things, I happened to see the latest Nook e-reader at Barnes & Noble last night, and was impressed by its extremely light weight, its small size, and its nice illuminated screen. (I’ve always been a fan of e-ink.)

So now I’m contemplating just sticking with my full-sized 3rd gen iPad, perhaps getting rid of the iPad mini, and grabbing a new e-ink reader to fill the “just reading a book” niche.

(Yes, I still have my original Nook, but it’s about the size and weight of the iPad mini, so it hasn’t gotten much use lately.)

Interesting Websites May-2013

This is a collection of interesting and useful websites for iOS development, as well as some more general technical stuff.


General

Apiary

If you deal with REST APIs, this is some pretty neat stuff:

    http://apiary.io

Quandl

Tons of numerical datasets. Not sure what I’m going to do with this, but I feel like I should come up with something:

    http://www.quandl.com

Xcode Tools

Xcode Package Manager

An easy way to manage adding packages that modify and improve Xcode.

    http://mneorr.github.io/Alcatraz/

Snippet Editing

This is a nice tool to let you edit your Xcode snippets.

    http://cocoaholic.com/snippet_edit/

Appledoc

If you are looking to document some Objective-C classes or frameworks you’ve created, this is a very easy way to generate documentation that looks like Apple’s. This tool can also create documents that will integrate nicely with Xcode as well.

    http://gentlebytes.com/appledoc/

Objective-C

Objective-C Features

Wondering when a certain feature was available in Objective-C? Wonder no more!

    http://developer.apple.com/library/ios/#releasenotes/ObjectiveC/ObjCAvailabilityIndex/index.html

Classes, Frameworks, and Libraries

Networking

This makes networking so much easier it has to be seen to be believed. There are many large and serious iOS apps and products that make use of this library!

    https://github.com/AFNetworking/AFNetworking

Core Data

    https://github.com/magicalpanda/MagicalRecord

Handy Classes

This is a collection of very handy-looking classes for iOS development.

    http://sstoolk.it

Sliding side-panels

    https://github.com/ktatroe/sidepanel-ios

JKFiltering

Filter arrays with blocks

    https://github.com/jklaiho/JKLFiltering
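The Objective-C API itself differs, but the core idea of filtering an array with a block is just applying a predicate to each element. A minimal sketch of the concept, in Python rather than Objective-C:

```python
# The "filter an array with a block" idea: a block is essentially
# a predicate function applied to each element of a collection.
# (Conceptual sketch only; not the JKLFiltering API.)
def filtered(items, predicate):
    return [x for x in items if predicate(x)]

evens = filtered([1, 2, 3, 4, 5, 6], lambda x: x % 2 == 0)
print(evens)  # [2, 4, 6]
```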

Numeric Entry

    https://github.com/benzado/HSNumericField

iPad Mini

The iPad mini is an experiment – wifi only and just 16G, so it’s not planned to replace my “big” iPad.

May put the Nook into the museum though (or limited to use on the beach where sun makes LCDs a poor choice).

The mini is thinner and lighter than the Nook and just slightly larger in width and height. And iBooks is way nicer than the somewhat limited Nook software.

The new Lightning connector is very slick – easy to connect and feels a little less fragile than the complicated 30-pin connector of yore. Yes, it’s a pain to have to (slowly) transition all of my cables and devices, but the 30-pin connector had a good run.

The display is certainly less crisp than the Retina display on my full-size iPad, but it’s not bad, just not as good. If you’ve never spent time using a Retina iPad, it will look great, as the dpi is actually better than that of the non-Retina iPads.

Cross-Compiling for BeagleBone Using a Linux VM on the Mac

Although it may seem more complicated, I decided the best way to cross-compile from my Mac would be to do it from a Linux system.

I would of course love to have my BeagleBone cross-development environment running directly on OSX, but that’s an effort for another day – my goal here was to be able to cross-compile for the BeagleBone.

Yes, in theory this should be possible on a Mac since it’s Unix-based, but it appears that it would be more pain. More pain than just setting up an Ubuntu system using VMWare Fusion, anyway. This is quite easy these days – both VMWare and Parallels make it almost effortless. There are many resources on how to do this, so I’m not going to cover that here.

With Ubuntu 12.04 running happily in a VMWare VM on my MacBook Pro, away I went.

To just cross-compile, it doesn’t appear that you really need the OpenEmbedded and Angstrom kernel stuff, so I went with the pre-built Angstrom toolchain available at:

    http://www.angstrom-distribution.org/toolchains/

I used angstrom-2011.03-i686-linux-armv7a-linux-gnueabi-toolchain.tar.bz2
This is the 32-bit version, which matches my Ubuntu VM setup. There is a 64-bit version there as well, if you are running a 64-bit version of Linux on your host system.
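If you’re not sure which flavor your Linux VM is, the machine architecture string tells you. A small sketch of the check (the mapping of architecture names here is an assumption covering the common Intel cases):

```python
import platform

# Decide whether the host needs the 32-bit or 64-bit Angstrom toolchain.
# platform.machine() returns e.g. 'x86_64' on 64-bit Intel Linux
# and 'i686' on 32-bit; other architectures aren't handled here.
def toolchain_flavor(machine=None):
    machine = machine or platform.machine()
    return "64-bit" if machine in ("x86_64", "amd64") else "32-bit"

print(toolchain_flavor("i686"))    # 32-bit
print(toolchain_flavor("x86_64"))  # 64-bit
```

(`uname -m` from a shell gives the same architecture string.)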

Step-by-Step

On the Cross-development Host

Extract the pre-built angstrom toolchain:

    $ cd /
    $ sudo tar -xf <path to the angstrom toolchain>

Which creates the new directories /usr/local/angstrom and /var/lib/opkg.

Run the environment setup script, which puts the new tools in the path as well as sets up several other environment variables:

    $ . /usr/local/angstrom/arm/environment-setup

Create a simple “Hello, world” program using whatever editor you prefer. I named mine “hello.c”.

    #include <stdio.h>

    int main()
    {
        printf( "Hello, world\n" );
    }

And finally, do a test build.

    $ arm-angstrom-linux-gnueabi-gcc hello.c -o hello

This should result in an executable file named hello. (This is a C source file, so I used gcc, but you can also change the gcc at the end of the command to g++ to compile C++ source code.)

On the BeagleBone

If you transfer the executable binary output file hello to your BeagleBone (e.g. via FTP), you should be able to run it and see the text “Hello, world” printed out in the BeagleBone terminal.

    $ ./hello
    Hello, world

Resources

This guide is based on bits of information gleaned from a variety of sources, including:

Linux To Go
electrons on radio
Trey Weaver’s Blog
gpio.kaltpost.de

Thanks to everyone who has blogged about their experiences with the BeagleBone!

Writing a BeagleBone SD Card Image From the Mac Revisited

Since I last dealt with writing an SD card image to update the BeagleBone software, some things have changed. So, here’s an updated guide to the process.

Download the Image

The latest images are available here: http://beagleboard.org/latest-images

The one I downloaded was named: Angstrom-Cloud9-IDE-GNOME-eglibc-ipk-v2012.05-beaglebone-2012.11.22.img.xz

Uncompress the Image

The latest BeagleBone images are compressed in a format that tar or unzip cannot deal with. Fortunately there is a free application, The Unarchiver, that can. It’s available here: The Unarchiver.

Run this application on the .xz image file to get a (much larger) file that ends in .img. This is the file you’ll write to the SD card.

Unmount the SD Card

To write the image to the SD card, it first needs to be unmounted.

    $ diskutil unmount /Volumes/YourCardNameHere

Find the Device Name

This step is critical.

Using the wrong device name can destroy the data on your computer’s hard drive, so be very very sure to get the device name correct in the following steps!

This can be found in a couple of ways. You can use the “Disk Utility” application, or the command line diskutil or df commands.

Diskutil

From the command line:

    $ diskutil list

df

    $ df

This may show you a partition, something like disk7s2. You want the entire SD card, not any partitions, so drop the s2 part.
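Dropping the partition suffix is just stripping the trailing “sN” from the device name. A tiny sketch of that transformation (the helper name is mine, not a system tool):

```python
import re

# Given a partition device like 'disk7s2', derive the whole-disk
# device name by dropping the trailing partition suffix ('s2').
# A name with no suffix passes through unchanged.
def whole_disk(device):
    return re.sub(r"s\d+$", "", device)

print(whole_disk("disk7s2"))  # disk7
print(whole_disk("disk7"))    # disk7
```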

Write the Image

Then, write the image to your SD card. Note that the diskXXX should be the actual disk device assigned when the SD card is connected, and Angstrom-XXX should be the name of the actual card image you downloaded and extracted previously.

Again be sure to use the right device name for the SD card in this step!

Don’t be surprised if this takes a while – on my system it took about 37 minutes, and there are no “in progress” indications of any sort, so be patient! (Note that this time is more a function of how big the image is – ~3.4G – and the speed of your micro-SD card than how fast your computer is.)

    $  dd if=Angstrom-XXX.img of=/dev/diskXXX bs=4096

If you aren’t logged in as root, you may need to use sudo, in which case the command is:

    $  sudo dd if=Angstrom-XXX.img of=/dev/diskXXX bs=4096
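The roughly 37-minute figure mentioned above is about what simple arithmetic predicts for a ~3.4G image at a sustained card write speed of around 1.5 MB/s (the speed is inferred from those two numbers, not measured):

```python
# Estimated time to write an SD card image with dd.
# The ~1.5 MB/s card write speed is an assumption inferred from
# the ~3.4 GB / ~37 minute figures in the text above.
def write_minutes(image_bytes, bytes_per_sec):
    return image_bytes / bytes_per_sec / 60

print(round(write_minutes(3.4e9, 1.5e6)))  # 38
```

A faster micro-SD card shortens this more or less proportionally.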

The Dish and My Non-Love of Ads

So Andrew Sullivan of The Dish is taking his blog independent, and using a no-ad pay model. This has been getting a lot of buzz on the internet.

After taking a look at his existing blog and finding it interesting, I paid my $19.99 for a year. Hopefully I’ll enjoy the blog!

Why did I take this rather impulsive leap and support a blog I haven’t ever really read?
Simple – I hate the ad model. Repeat after me “If you don’t pay for it, you’re not the customer, you are the product.”

I’m putting my money where my mouth is by supporting blogs like this, iTunes, Netflix etc.
(This is also why Hulu can go pound sand – I’ll watch ads in TV shows – especially ones I pay for – only if there is no alternative. Long live Netflix & iTunes!)

Once you start to get your entertainment without ads, it’s hard to go back.

It really bugs me that even though the New York Times has a pay model, you still get ads on the web. Why not have an option to eliminate them for a higher cost? I don’t get it. I understand that printing and selectively delivering two versions of a printed newspaper isn’t practical. But online? It’s a Simple Matter of Software. Could be done easily.

My time has value. I’ll gladly pay what something is worth to watch, listen, or read it.