Connecting to a SoftEther VPN server as client from OSX / Yosemite

If you’re here, you may already know that the OSX VPN client doesn’t work very well. Apparently Apple upgraded and nuked some of its dependencies, and the details are…complicated. Anyway, after spending several hours tailing system.log and digging through Apple support posts, this is all I have to say about that. This post (a Linux guide) helped immensely.

Luckily, SoftEther produces a command line utility:

vpncmd

Use it to establish a connection. Grab the full source from here, and run:

./configure && make && sudo make install

Ok, now you need to start the vpn client:

sudo vpnclient

…And configure it:

sudo vpncmd localhost /CLIENT /CMD NicCreate tun0

If you get an error here, install tuntap so you can create virtual interfaces.
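If you don't have tuntap yet, it's available as an installer from the tuntaposx project, and (an assumption on my part, depending on your Homebrew setup) as a Homebrew Cask:

```shell
# Install the tuntap kernel extensions via Homebrew Cask
# (assumes Homebrew with the Cask extension is installed)
brew cask install tuntap
```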

Now, set your account details:
sudo vpncmd localhost /CLIENT /CMD AccountCreate YOUR_MADE_UP_VPN_ALIAS

This will prompt you for the connection details: the destination server host and port, the virtual hub name, the username, and the virtual network adapter to use (tun0 from above). Be very careful here to enter the correct options.

Set your password (if you're using a certificate, you'd need to do that here instead):
sudo vpncmd localhost /CLIENT /CMD AccountPasswordSet YOUR_MADE_UP_VPN_ALIAS /PASSWORD:123456 /TYPE:standard

Bring your network interface online:

sudo vpncmd localhost /CLIENT /CMD NicEnable tun0

Connect your account:

sudo vpncmd localhost /CLIENT /CMD AccountConnect YOUR_MADE_UP_VPN_ALIAS

You can check the status via AccountStatusGet.

Once your connection has been established, you need to enable routing.

This will get you an IP from the VPN gateway:
sudo ipconfig set tap0 DHCP

This will add a path to your VPN gateway through your local router device:

sudo route delete default && sudo route -n add VPN_IP_HERE/24 LOCAL_ROUTER_IP

Finally, this will make the VPN gateway the default, and you should have access to your entire VPN network:

sudo route add default DEFAULT_GATEWAY_OF_DEVICE_ON_TAP0
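If you do this often, the connect-side steps above can be collected into one script (a sketch, run as root; the ALL_CAPS values are the same placeholders used above):

```shell
#!/bin/bash
# Placeholders - substitute your own values, as in the steps above.
VPN_ALIAS=YOUR_MADE_UP_VPN_ALIAS
VPN_IP=VPN_IP_HERE
LOCAL_ROUTER_IP=LOCAL_ROUTER_IP_HERE

# Start the client, bring up the virtual NIC, and connect the account
vpnclient start
vpncmd localhost /CLIENT /CMD NicEnable tun0
vpncmd localhost /CLIENT /CMD AccountConnect $VPN_ALIAS

# Get an address from the VPN gateway, then route through it
ipconfig set tap0 DHCP
route delete default
route -n add $VPN_IP/24 $LOCAL_ROUTER_IP
route add default DEFAULT_GATEWAY_OF_DEVICE_ON_TAP0
```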

Here is a quick script for disconnecting (run as sudo):

#!/bin/bash
vpncmd localhost /CLIENT /CMD AccountDisconnect YOUR_MADE_UP_VPN_ALIAS
vpnclient stop
route delete default
route -n delete VPN_IP_HERE/24
route add default LOCAL_ROUTER_IP_HERE
Posted in Networking | Leave a comment

OctoBadge – Display Github user badge showing open source activity

A few years ago there was a lot of controversy about Github being the new resume for developers and what that meant. In 2015, I think it’s almost universally accepted among developers that your code footprint is important. It’s very likely it’s going to be the first impression that you make on others in your profession when they look you up online.

There are many fine points which are still up for debate. How much should you curate your Github profile? Should you delete old projects or is a simple warning enough? What should you do when most of your work is private? What if the work others think is good (stars and forks) aren’t necessarily the projects you want to highlight? While these points and others are valid, it doesn’t change the fact that people really expect that if you’re a software developer, you should have a Github profile and you should have used it at some point in time.

You don’t need to have a project with hundreds of stars or hundreds of followers, but you should at least have some activity. I don’t care what academic or corporate rock you’ve been hiding under, but there’s no excuse for no participation at all in open source software. You should have at least raised issues, commented on something, sent in a few pull requests to improve the projects you use. Outside of the most contrived cases imaginable, I just cannot picture how you can completely avoid Github as a software developer. Even if you have use BitBucket, or Assembla, SourceForge, or whatever, so much software is on Github that you just can’t avoid it.

So, maybe you disagree with me. Cool, stop reading. If your head isn’t in the sand, I have something for you. It’s a Github User Badge which displays info about your projects, such as your number of followers, repositories, gists, project stars, and your contribution history. The larger version also shows the languages you write code in and where you hail from.




The library is also hosted on jsDelivr, so you can easily include it on your site:

JS

<script src="//cdn.jsdelivr.net/octobadge.github/0.0.1/badge.min.js" type="text/javascript"></script>



CSS

<link href="//cdn.jsdelivr.net/octobadge.github/0.0.1/badge.min.css" rel="stylesheet" type="text/css" />



…Then use the badge HTML:

<github-badge user="username_here" badge="octogeek"></github-badge>

Enjoy! Get your Github User Badge.

Posted in Developers, Github, JavaScript | Leave a comment

Docker / vagrant front end development without host syncing

I’ve been a vagrant user on OSX for a long time, and one of the common methods people use to allow modifying files inside the VM in their preferred development environment is through host syncing (synced folders).

Let me tell you from experience…it’s a shit option. If you aren’t using NFS for the synced folders (but rather the default VirtualBox shared folders), it’s painfully slow, especially on large projects. In addition, using NFS changes the networking options you have available, so it introduces other complications. I had resorted to simply running vim within my VMs because I couldn’t take the slow load times and compromises on configuration.

When I started setting up a development environment with Docker, I decided I would get it right this time. Since under the hood Docker runs on top of VirtualBox (same as vagrant) on OSX, I wasn’t ready to go through the same pain again to get everything synced.

I elected instead for this solution:

- Pass in configuration via environment variables to set the Docker environment (and thus the container behavior) without image rebuilds.
- Download the repo that contains the application code during container setup.
- Create a local SSH mount for development.
- Install node dev tools on the container.
- Gulp runs watch, browserify, coffeeify, etc., on the container without looking at any shared volumes from my host.
- Nginx serves up the dist directory that gulp modifies as I make changes.

When running in development mode, the container adds a dev user, copies my local key to the container for that dev user, and starts the ssh server.

Here’s an example of how I’m loading my Docker container:

docker run -d -p 52022:22 -e key="$(cat ~/.ssh/id_rsa.pub)" IMAGE_ID

Here’s the magic bit that makes this work in the Dockerfile (I omitted the pieces that install dependencies, download a git repo, etc.):

ENTRYPOINT sh start.sh

…and the start.sh script:

if [ "$project_env" = "production" ]
then
  echo "Prod!"
else
  # Create the dev user first, so the key can go in its home directory
  adduser --quiet --disabled-password --shell /bin/bash --home /deviant-ui --gecos "User" dev
  echo "dev:dev" | chpasswd
  # sshd looks for authorized_keys in the user's home directory
  mkdir -p /deviant-ui/.ssh
  echo "$key" >> /deviant-ui/.ssh/authorized_keys
  service ssh start
  cd /project
  git pull origin master
  gulp
  # The dev user needs to own both its home and the code it edits over sshfs
  chown -R dev /deviant-ui /project
  echo "Dev."
fi

nginx -g "daemon off;"
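For context, here’s a minimal sketch of what the omitted Dockerfile around that ENTRYPOINT might look like; the base image, package names, and repo URL are all assumptions on my part, not the actual setup:

```dockerfile
# Hypothetical sketch - adjust packages and paths for your project.
FROM ubuntu:14.04

# Dependencies: git, nginx, an SSH server, and the node toolchain
RUN apt-get update && apt-get install -y \
    git nginx openssh-server nodejs npm
RUN npm install -g gulp

# Clone the application code into the image
RUN git clone https://example.com/your/project.git /project

COPY start.sh start.sh
EXPOSE 22 80
ENTRYPOINT sh start.sh
```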

I can now mount to my local filesystem using sshfs, which is installable on OSX via brew:

mnt_ip=$(boot2docker ip) && sshfs -p 52022 dev@$mnt_ip:/project /tmp/test

Unmounting is as easy as:

sudo umount -f /tmp/test

Now, I can work on my local machine, changes are synced instantly to my dev container where my gulp process can handle file watches and change the dist folder that nginx serves from, and I can push changes to my application code without modifying my dev docker image at all.
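The host side of that loop fits in a small helper script as well (a sketch; the image name and mount point are placeholders you’d adjust):

```shell
#!/bin/bash
# Hypothetical values - substitute your own image and mount point.
IMAGE=my-dev-image
MOUNT_POINT=/tmp/dev-mount

# Start the dev container with my public key injected
docker run -d -p 52022:22 -e key="$(cat ~/.ssh/id_rsa.pub)" $IMAGE

# Mount the container's project directory locally over SSH
mkdir -p $MOUNT_POINT
sshfs -p 52022 dev@$(boot2docker ip):/project $MOUNT_POINT
```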

I’m not finished setting this up yet, but I just wanted to share these things I’ve found as most of the other examples online seem to take the hard path of fighting with VirtualBox. I’ll publish a complete example when it’s ready. To the people who say a Docker container should only do one thing, my answer is that in production, it will only do one thing! In development, I value my time and sanity more than rigid philosophy.

Posted in DevOps, Docker | Leave a comment

Custom docker images in a private docker registry backed by s3

Say you want to use Docker to containerize your infrastructure. Step 1 is making your infrastructure composable through Docker images. Here’s a quick tutorial on how you’d do that.

First, create the registry server:

docker run \
  -e SETTINGS_FLAVOR=s3 \
  -e AWS_BUCKET=YOUR_REGISTRY \
  -e STORAGE_PATH=/ \
  -e AWS_KEY=YOUR_KEY \
  -e AWS_SECRET=YOUR_SECRET \
  -e SEARCH_BACKEND=sqlalchemy \
  -p 5000:5000 \
  registry

Pull an existing container:

docker pull containername

Run it and make some changes (the -t flag allocates a pseudo-TTY, and the -i flag keeps STDIN open so the session is interactive):

docker run -t -i containername /bin/bash

Make a change in the container and then turn it into an image:

docker commit -m "touched file" -a "Calvin Froedge" ID_YOU_CHANGED new_image_name

Running docker images will show you your new image. Grab its id.

Tag the ID:

docker tag THE_ID localhost:5000/test

Push:

docker push localhost:5000/test

Congrats! Anyone you give the AWS key & secret to can start this registry on their own machine and download your Docker image:

docker pull localhost:5000/test
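To recap the push side in one place, the commit/tag/push sequence looks like this (a sketch using the same placeholder names as above):

```shell
#!/bin/bash
# Placeholders - substitute the container ID and image name from above.
CONTAINER_ID=ID_YOU_CHANGED
IMAGE_NAME=new_image_name

# Snapshot the modified container as a new image
docker commit -m "touched file" -a "Calvin Froedge" $CONTAINER_ID $IMAGE_NAME

# Tag the image against the local registry, then push it
docker tag $IMAGE_NAME localhost:5000/test
docker push localhost:5000/test
```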

Posted in DevOps, Docker | Comments closed

How To Generate Salesforce API Credentials

I usually wouldn’t write about such a mundane subject, but given the past several soul sucking hours, I decided I would put in my humanitarian service for the week by telling others how to accomplish what should be a totally trivial task, that just isn’t for Salesforce.com: generating API credentials.

Given that I’m totally new to Salesforce integrations, I’m sure that someone will come find this post at some point and lambast me for my missteps and miscomprehension. That’s fine, because this post isn’t written for you. It’s written for the thousands of application developers out there who have customers who have unfortunately chosen to use this overpriced behemoth.

Also, big WARNING, it’s apparent from the many stack exchange questions and salesforce.com instruction links that Salesforce changes their shit all the time and doesn’t make any effort to clean up their documentation or ensure the old flow still works. So if you’re reading this around March 2015, then it should be good, otherwise, if the following instructions just don’t match what you see, it’s not you, it’s Salesforce.

For those of you who just need to generate a set of API credentials for your own data automation and synchronization needs:

Step 1:

Create an account. You can create a (free) developer account at developer.salesforce.com


Step 2:

Ignore all the landing pages and getting started crap. It’s an endless marketing loop.


Step 3:

Click the “Setup” link


Step 4:

In the lefthand toolbar, under “Create”, click “Apps”


Step 5:

Under “Connected Apps” click “New”


Step 6:

Fill out the form. Important fields are marked below (you can leave the rest blank)


Step 7:

Be advised that Salesforce has crappy availability.


Step 8:

Press continue. You finally have your key (client id) and secret (client secret).


Step 9:

But wait! You’re not done yet.

Make sure IP restrictions are disabled as well, and make sure that Permitted Users is set to “All users may self-authorize.”

If you’re concerned about disabling security, don’t be for now, you just want to get this working for now so you can make API calls. Tighten permissions once you have everything working, one at a time, so you can figure out what setting is giving you authentication errors.


Step 10:

Celebrate! This curl call should succeed:

curl -v https://login.salesforce.com/services/oauth2/token -d "grant_type=password" -d "client_id=YOUR_CLIENT_ID_FROM_STEP_8" -d "client_secret=YOUR_CLIENT_SECRET_FROM_STEP_8" -d "username=user@wherever.com" -d "password=foo@bar.com"

Notes:

- You shouldn’t be doing password authorization if you’re building a multi-tenant app, where users need to authorize their own application. Use the Oauth2 workflow for that.

Posted in Internet, Programming | 6 Comments

Review – ZPacks ArcBlast

Though I love my Osprey Aether 70L pack for its toughness and comfort, in preparing for a hike like the Appalachian Trail, weight was my number one consideration. This inevitably led me to ZPacks, a brand created by a former thru-hiker, Joe Valesko, who was known on the trail for his ultralight homemade gear. He also thru-hiked the PCT and the AT.

I picked my ArcBlast up from Joe’s bargain bin for the ultra-reasonable price of $250. The pack was a return, and had been customized for someone else, but fit my needs perfectly. The pack is constructed with small aluminum supports, cuben fiber material, and some exceptional stitch work. It weighed in at just 18 ounces, and that was before I removed the aluminum and the hip supports.

With a bit of creativity, the pack can be very comfortable (one thru hiker suggested that I use my sleeping pad as a back pad, which worked very well). It bears weight surprisingly well for its design and I felt the pack was acceptably comfortable even when I was pushing the advertised weight limit of 30 pounds.

The stitching that holds on the shoulder straps eventually began to wear, but not until the end of my thru-hike. This is fairly remarkable considering how little material is actually there and how rough I was on my pack. After running out of food a few times, I developed the habit of carrying far more food than I needed, so sometimes I was pushing the weight limit, and I spent a lot of days putting in miles at a jog, sometimes even a brisk running pace. It handled the abuse well.

The mesh on the back of the pack is also made to be ultralight and can be ripped if snagged or worn by a sharpish edge, but by no means rips easily. The pockets on either side of the pack are perfectly placed, and the cuben fiber, in addition to being ridiculously light, also happens to be both tough and waterproof. I could hike in pouring rain, even go through chest-deep water, without getting anything in my pack wet. Most hikers struggled with pack covers – no need with the Arc Blast.

Given the unheard of weight, the relative durability and the remarkable cuben fiber material, I highly recommend you consider a ZPacks backpack for your long distance hiking arsenal.

Posted in Backpacking, Outdoor Gear | Comments closed

Hotswap + Nesh. A more awesome Node.JS development experience.

Ever since working on some minor Clojure projects and getting some experience with leiningen and LightTable, I’ve been expecting a little bit more from the programming environments I work in.

Anyway, as I’ve begun doing significantly more work using Node in the past few months, I decided to really dig into the available toolset. I’ve discovered a few things that have really made life much more pleasant.

Hotswap

Hot code reloads are pretty necessary for catching errors quickly and being able to interact with your code as you write it. The hotswap module helps with this nicely. It’s not perfect – changes in modules required by the modules you hotload aren’t recognized instantly, so unless you’re only working on a few modules you’ll probably have to restart your REPL session at some point, but it’s a big improvement over the standard REPL experience. All you have to do to get rolling is this:

require('hotswap');

and in your module…

module.change_code = 1;

Nesh

Nesh is a drop-in replacement for the standard Node REPL. It features code inspection, multiple language support (you can write CoffeeScript), convenience functions and a plugin system.

You can also start it as part of a node process, just like the standard REPL.

One thing that really annoys me about the Node REPL (and that Nesh shares), and which I haven’t figured out a good solution to yet: when (for example) a syntax error is thrown, the Node process bails and the REPL closes. I’m sure a solution is possible, I just haven’t tracked one down. A work-around, and an otherwise generally useful practice, is to do some preprocessing to prepare my REPL environment. That way, I don’t need to load too much of what I was working on when I restart the REPL. This is possible with both the standard REPL (by modifying the REPL context) and with Nesh (by modifying the global scope). Here’s an example:

Say I want to load a specific REPL experience (for working in one part of my application vs another). I have a script which is included anytime I want to have a REPL:

var hotswap = require('hotswap'),
    nesh    = require('nesh');

var opts = {
    welcome: 'Welcome!',
    prompt: '> '
};

// Load user configuration
nesh.config.load();

module.exports = {
    start: function(callback){
        // Start the REPL
        nesh.start(opts, function (err) {
            if (err) {
                return nesh.log.error(err);
            }
            if (callback) callback();
        });
    }
};

That bit loads hotswap and calls a callback when the REPL starts. The callback is what I use to inject what I want into the scope:

var repl = require("./repl"),
    fs   = require('fs');

repl.start(function(){
    config  = require('../config');
    lib1 = require(config.libdir+'/lib1');
    lib2 = require(config.libdir+'/lib2');
});

Voila! A very pleasant Node development experience – an interactive REPL with code hot swapping and configuration dirty work taken care of.

Posted in JavaScript, NodeJS, Programming | Comments closed

An open letter to Baxter State Park

To whom it may concern,

On July 22nd, 2014, I entered Baxter State Park at around 9PM with the intent of camping at the Katahdin Stream Campsite.

Upon arriving at the clearing with the bathroom facilities, ranger station and employee housing near the Daicey Pond campsite, I took note of the sign informing me that I would not be able to check into the camping site after hours. Specifically, the sign said I would not be admitted. It was after midnight and I had walked nearly fifty miles that day. As a thru-hiker, I had become accustomed to reaching my destinations very late. This was the first indication I had seen that I would not be able to sleep at the Katahdin Stream Campsite. There was no indication that I could continue to the campsite and pay in the morning. The sign simply stated that I would not be admitted. I knew that “stealth camping,” or unauthorized camping, was not permitted, so I chose to sleep, very uncomfortably, on a gravel pile directly in front of the employee station, so that when I woke up in the morning I could promptly pay the $10 fee that would have been required had I arrived before the Katahdin Stream Campsite closed.

When I woke up and was packing my tent, I indeed met an employee driving towards the Daicey Pond site in her truck. I believe her name was Rita. Rita refused my $10 and reported me to the law enforcement ranger. She asked me for identification and I provided it willingly.

The Baxter State Park Rules and Regulations book states that camping is limited to certain sites to preserve flora and fauna. I slept on a gravel pile next to a gravel road in front of a building. Presumably, this gravel road and building destroyed a lot of flora and fauna in their construction. Had I any intention of dishonestly avoiding the $10 fee, I would have simply stealth camped in the woods, and would have never been discovered. I would never have given my identification to your employee, who was not physically capable of pursuing me. Yet, I chose to limit my ecological impact by sleeping on the gravel pile, and to attempt to pay the $10 fee despite having slept on a gravel pile.

After my encounter with the first park employee, I talked to the ranger at the base station. He acted like he was sympathetic to my story and told me the employee I talked to earlier “liked her job too much,” and was making a big deal out of nothing. He said she had been on the radio making a big deal about it. He said he didn’t care, but it would probably be a good idea for me to check in with the law enforcement ranger.

I hiked up to the end of the trail at Katahdin and upon my return, went to talk to the ranger at the base station about how to contact the law enforcement ranger. He informed me that the law enforcement ranger was leading a convoy of construction cranes, and that to link up with him, I would need to run a half mile to the road and hop in his truck. I did.

The officer, Isaac Needell, summarily proceeded to tell me that I broke the law and that he would be writing me a ticket for it. I brought up the fact that the park’s charter, and the intent of the law, was to preserve the ecology of the park, and what I had done had satisfied the spirit of those requirements. I told him that I believed that this had nothing to do with preserving biodiversity, and everything to do with revenue.

I do not deny that I broke the law by sleeping most miserably on your gravel pile. Your policy communicates to people that it is better to dishonestly avoid the law, than to respect the spirit of a law and approach any breaches honestly. I voluntarily slept in front of your employee station on a pile of gravel that was left over from park development projects that most assuredly destroyed the flora and fauna your charter claims it protects. I voluntarily confronted your agents, provided my identification, and entered officer Needell’s car. The fact that I was penalized for my honesty and conscientious behavior, and that my money was refused in favor of the opportunity to extort me, speaks volumes about the true intent of Baxter State Park – commercialization of natural resources and revenue through extortion.

Best regards,
Calvin Froedge

Posted in Appalachian Trail | Comments closed

Dreams and Feedback Loops

I had a dream last night about Chris Granger.  He worked on Microsoft’s Visual Studio team, and then went on to build LightTable, a next generation IDE written in Clojure.  I’ve known about LightTable for a couple of years, but have only begun using it recently.  The takeaway from the dream was, “Get to know your tools, learn as much as you can.”  I went to Chris’ blog this morning and started reading posts.  I came across this post, which talks about connecting to what you create.

Chris referenced a talk by Bret Victor, a kind of renaissance man, who was discussing feedback loops during the creative process.  He showed some screenshots from a design tool he built which allows you to visualize your work as you complete it.  For example, you write an algorithm, and you can see, as you write the algorithm, how it manipulates data.  This is where Chris Granger got his inspiration for LightTable, and after seeing the talk, went and coded a prototype in Clojure.

Bret talks about working by guiding principles, choosing tools that give you immediate feedback, and nurturing your ideas.  It’s funny how inspiration hits.  Chris saw this video and then started coding.  Fast forward two years and he raised a few hundred thousand on KickStarter and built an awesome product.  I didn’t get just how powerful it was until watching this video, and now have a reason to take the time to master the tools while continuing on the project I’m working on.

Dreams are funny.

Posted in Ideas, Programming, Software Design | Comments closed

Why Abe Lincoln Would be Homeless Today

“Let no man live who is wiser or better or more famous or even handsomer than the mass. Cut them all down to a level: all slaves, all ciphers, all nobodies. All equals.”

- CS Lewis, The Screwtape Letters

We have, in American culture, most curious ideas about equal opportunity.  Where once we valued inventive genius, we now tend towards something which is much more static, predictable, and stale.  Fifty years ago, around 5% of Americans held college degrees, and fewer than 40% had graduated high school.  Today, more than 30% graduate college and more than 90% from high school.  It’s easy to make the assumption that such an improvement in the educational attainment of the common man means that the average American is becoming more intelligent.

However, this just isn’t what’s happening.  We’ve been using standardized tests to evaluate college preparedness for nearly a hundred years, and those scores (SAT, namely) haven’t improved.  In fact, they’ve dropped 40 points since 1967.  Interestingly, scores haven’t dropped because the tests got harder as the average person became more educated.  The opposite is in fact true: scores have declined in spite of “re-scalings” and simplifications in the test format.  Meanwhile, higher education is becoming exponentially more expensive, outpacing general inflation by 500%, and graduates are getting less from it.

The problem is that we have, as a culture, become more accustomed to jumping through arbitrary hoops than to actually becoming competent at something.  Instead of being interested in really learning things, we set our sights on becoming certified to do them.  Abraham Lincoln attended less than a year of school throughout his lifetime, yet taught himself to read and write.  He took the bar exam, he passed, and at the height of his career he was handling more cases than any other attorney in Illinois.  He didn’t come from a rich family, he didn’t attend a prestigious school – he practically didn’t attend school at all.  He studied on his own, he took the test and he passed.  People hired him because he charged reasonable rates, spoke eloquently, and knew the law well enough to win cases.  Today, it’s not possible to “Read the Law” as Lincoln did.  Our legal code is vast, but understanding it certainly doesn’t require formal education – it simply requires actually reading the laws, something very few of us, including those hired to interpret them, do.

The list of professions requiring institutional licensure is vast.  Lawyers and doctors spend decades in school, beginning with childhood.  Accountants, engineers, massage therapists, psychologists, speech therapists, even hairdressers and barbers, all must complete long tenures in education before they can begin practicing in their chosen profession.  This doesn’t lead, as conventional wisdom would suggest, to better professionals.  The institutional educational model is diametrically opposed to the style of learning that some of the most profound scholars, professionals and leaders in human history have followed.  Under the institutional model, whether or not someone can do the job is irrelevant.  Whether or not someone aces the bar exam is irrelevant.  The real requirement is that they spend years of their life and thousands of dollars at a state-approved institution.  If the intended output of our educational apparatus really is people who are capable of performing certain societal functions, as demonstrated by their performance on a final written exam, why do we not simply allow people to just take the test and spend the best years of their life doing something other than required drudgery?  There’s a term in our prison system called “make work,” which describes work that is created simply to keep someone busy.  I can’t help but believe that the majority of the things we engage in are simply that – make work.

We live in a world abounding with free educational opportunities.  Universities around the world publish lectures, notes and research on their public-facing websites.  Between Khan Academy, Udemy, Udacity, Coursera and Wikipedia, you can easily find every educational resource you would need for both a world-class K-12 education, and a competitive university education.  The only thing you need is a bit of self-discipline.  In a free market economy, one expects that when better, cheaper alternatives to goods and services enter a market, all of the costs for those goods and services will be reduced.  The older, less effective, more expensive goods and services will at first have their prices lowered to remain competitive, and at some point, they will lose their economic viability entirely and disappear from the market forever.  This isn’t happening in education, because education in the United States is not a free market.  Economic vitality is tied directly to professional licensure, which is controlled by the educational system.  Economic livelihood in the United States has become a walled garden.  Someone could learn everything there is to know about medicine, but to practice medicine without going to jail, you have to stay in school till an institution says you can leave.  We all pay for this – not just the aspiring doctor.  Because the AMA tightly controls the process for becoming a doctor, and the number of doctors who can enter the market, we all pay more for medical services.  We debate about how to pay for national healthcare schemes, yet we don’t consider that the costs are so high in the first place because the design of our institutions makes them so.

I was drawn to software development because it’s difficult to regulate, and thus easier to practice without needing to do make work.  I’ve spent thousands and thousands of hours learning, and as a result, I can make software.  Because I can make software, people hire me to make software.  It’s significant that, being one of the only professional fields which does not require licensure to practice professionally, software development has made such a significant impact on every other field.  Electronic medical records, magnetic resonance imaging, synthetic biology, electronic diagnosis, etc., were all made possible by software engineering.  It doesn’t matter what industry you examine – architecture, aviation, psychology – every single one of them has made huge advances in recent years only because of the advent of computers and those who program them.  It’s ironic that these advances were largely enabled by unlicensed people.  Yes, universities were involved and played a considerable role.  MIT, Stanford, Berkeley and Michigan State all played huge roles in the development of the internet and of many advancements in programming tools, languages, and operating systems.  However, the contributions of the unlicensed amateurs were incredible.  Gates was educated by having access to computers at an early age.  He dropped out of Harvard and his operating system enabled cheap desktop computing for a generation.  Jobs dropped out of Reed and brought us Toy Story and the iPhone.  Larry Ellison dropped out of college, and went on to build one of the first relational database systems, power half of the world’s ERP systems and purchase an island in Hawaii.  John Carmack revolutionized the video game industry with his physics engines after dropping out of college, then went on to found Armadillo Aerospace, serving as the lead engineer.  His team won a $500k prize in NASA’s Lunar Lander Challenge with a rocket demonstrating a simulated lunar landing.  For comparison, NASA spent $25B in 1969 dollars to put a man on the moon, while the Soviets spent $4.5B on their Luna program.  Going back to the very birth of electricity, Benjamin Franklin, the guy who appears on the $100 bill, had a famous disdain for institutions.  He is remembered for such minor advancements as electricity, refrigeration and modern currency.

The point is that our society is totally reliant on progress made outside of our institutions, yet we chain ourselves to these institutions.  We require K-12 completion to be admitted to public universities on scholarship (GED students are not eligible for state merit scholarships in most states, regardless of standardized test scores).  We require degrees and licensure for so many professions and further study programs.  We’ve decided that people must do one thing at a certain age, and another thing at some other age.

Today, we stand at the precipice of systematic collapse.  Global food systems and ecology are on the brink.  Bees are dying, rain forests are being cleared at an unprecedented rate.  Our currency is worth less and our lives cost more.  Half the world lives on less than a dollar a day.  Deserts are expanding as soil levels are depleted.  And yet…an unassuming, sweet girl who just wants to get married and pop out a few kids has to stay in school for over a decade just so she can cut hair.  A young Isaac Newton has to sit in class with that girl until he reaches the level where he can take an advanced class with only slightly less dumb classmates.  As Lewis stated, “The bright pupil thus remains democratically fettered to his own age group throughout his school career, and a boy who would be capable of tackling Aeschylus or Dante sits listening to his coevals’ attempts to spell out A CAT SAT ON A MAT.”

Sure, there are exceptions.  There are ten year old students who go to college, graduate and spend the rest of their lives doing something meaningful, the miserable make work out of the way.  This exception requires a village of support, however.  It requires not only that the pupil be extraordinary, which is surprisingly common, but that the parents are willing to take action to challenge the student outside of what is conventional, that administrators of educational and professional institutions are flexible, and both willing and able to bend the rules for unusual circumstances.  A system which can accommodate someone with exceptional potential is, sadly, far less ordinary than the prodigy herself.  In this regard, the responsibility lies more on society to allow for greatness, than for individual greatness to occur.  What does all that mean?  For starters, that if a fifth grader can test out of high school, that the local school district damn well better encourage her to.  That  the parents better cast away all fears about social acceptance.  That an individual with no formal educational background, yet who has developed a substantial body of intellectual work, should be evaluated on her proven merits rather than her hoop jumping abilities.  The “real” world – the world that produces the things that allow the rest of us to survive – already works this way.  The trick is to make our institutional world – academia and government – mirror this rather than impede it.

Posted in Abstract, Philosophy, Politics, Programming | Comments closed