DevDay 2015, Inspiration, and a quick look back…

So far this year, which is obviously nowhere near finished yet, I have had some amazing experiences: .NET Fringe, Polyglot 2015, the Progressive .NET Tutorials 2015, Dev Day 2015, and more. I decided to add a bit more of a personal note to this blog entry because of the inspiration I just got from Michał Śliwoń (@mihcall) in his Dev Day 2015 Aftermath write up.

Just as Michał writes,

“Inspiration is like a spark. It can be one brilliant presentation at the conference, one sentence at some session, one hallway conversation with another attendee and I’m excited, coming back with a head full of new ideas. Every conference has this little spark”

and I completely agree. At .NET Fringe I got back into a few things on the .NET CLR stack, namely F# and a little toying around with Akka.NET and micro-services using those technologies. I also had a hand in organizing the conference and its origins, which I wrote about. At Polyglot 2015 my desire increased to become more familiar and comfortable with functional programming languages. At the Progressive .NET Tutorials I was again inspired to dive deeper into functional languages and to look more closely at everything from Weave to other container- and virtualization-based systems.

Thrashing Code News

One thing this led me to is putting together a list of people who are interested in these types of conferences. I'm talking about the really down to earth, nitty gritty, get into the weeds of the technology, and meet the people building and using that technology every day conferences. You can sign up for this list here – and do read the article just below the sign up page, as this is NOT some spam list. I'll be putting real effort and time into good content when the list officially kicks off! I will blog about, and of course get out, that first Thrashing Code News email in the coming months.

Again at Dev Day I was inspired by many people and got to meet many of them. Which leads me to the number one thing that makes these conferences absolutely great: it's all about the people who attend.

The People

I got to meet Rob Conery (@robconery / http://rob.conery.io/). We hung out, had beers, talked shop, talked surfing, talked tech and training screencasts, discussed future bad ass conferences (again, sign up to my list and I’ll keep you abreast of any mischievous conference Rob & I dive into) and tons more. It was seriously kick ass to meet Rob, especially after not getting a chance to at what must have been a gazillion conferences he and I have both been at before!

I finally met Christian Heilmann (@codepo8), who I think must have also been at a gazillion of the same conferences while we somehow managed to not meet each other. Good conversation, talk of Seattle, other devilish code happy things – and hopefully a beer or two to be had with good Christian in the near future in Seattle or Portland (or thereabouts).

I had the fortune of running into Alena Dzenisenka (@lenadroid) again doing what she does, which is tell, teach, and show people a whole lot of awesome F# handiwork. For instance at Dev Day she was throwing down some machine learning math and helping to get people started. She’s also got some talks lined up near the Cascadian lands (that’s Seattle and Portland, but also San Francisco and Dallas!!) if you haven’t noticed, so come get inspired to sling some functional code!

On day one the keynote by Chad Fowler (@chadfowler) was excellent. I’d not realized he was a fellow who escaped the south like I had, all while playing a bunch of music! I was able to catch Chad and chat a bit on day two of the conference. His presentation was great, and he’s motivated me to give his book The Passionate Programmer a read.

Another individual who I’d been aiming to meet, Mathias Brandewinder (@brandewinder), was also at the conference. I even attended some of his workshop and learned a number of things about F# and machine learning. I’m definitely inspired to dig deeper into the machine learning realm and start figuring out more of the truly amazing things we can do with computers and machine learning algorithms – I honestly feel like we’ve only skimmed the surface of much of this technology. Mathias also has a book that is truly worth buying, titled “Machine Learning Projects for .NET Developers“. If you’re curious, yes, I have the book and am working through it steadily!  🙂

Gary Short (@garyshort) provided an amazing talk on digging into crop yields via the European Space Agency Data Science Project. I also enjoyed the multiple conversations that I was able to have with Gary from the talk of “really really really awesomely excited Americans” vs. “excited Americans” all the way to the talks on the matter of data science and crop yields themselves. Gary’s talk is linked below, so get a dose of the crop yields yourself, and any complaints be sure to send to his @robashton twitter account!  (But seriously, you should follow Rob Ashton too as he’s got a lot of good twitter nuggets).

Another person I was super stoked to run into again is Tomas Petricek (who I hear might be in the Cascadian lands of the Seattle area in a month or so). I met Tomas at Progressive .NET Tutorials in London and enjoyed a number of good conversations, and his general awesome personality and hilarious demeanor! Not sure I mentioned, but he’s got some wicked F# chops too. He spoke about Understanding the World with Type Providers, which is something that you should watch as it’s an interesting way to wrap one’s mind around a lot of ideas.

I also, after many random conversations about a whole host of topics in the Functional Programming Slack (follow the link to sign up) chats, got to meet Krzysztof Cieślak (@k_cieslak). Krzysztof (and if you can’t pronounce his name just keep trying, you’ll get it right sometime around 2023) was great to meet and catch up with in person. It was also great to hear tidbits about what he’s working on, since he’s driving some really cool projects, including the Ionide Project for the Atom Editor.

There are so many people I enjoyed chatting with and getting to meet; I really wish I had more time to hang out and chat or hack with everybody. I met so many other individuals that I already feel like a prick for not being able to write something about every single awesome person I spoke with at Dev Day and in the days after the conference. To those I didn’t: sorry about that, drinks and dinner are on me when you’re in Portland!

…on that note, get subscribed to Thrashing Code News so I can update you when the rumblings and dates of the next kick ass conferences, hackathons, hacking festivals, or other great materials and learnings come up. In addition, get inspired to speak or get involved in some way, and help make the next conference you attend as kick ass as you’d want it to be! It’s easy, just fill out your name and email here.

…and to Michał and Rafał, I’ll be following up with you guys on some of my next conference efforts coming up in the Cascadian Pacific Northwest (i.e. Seattle/Portland area)! Cheers!

Strongloop

Framework: Strongloop’s Loopback

Recently I did a series for New Relic on three frameworks for both APIs and web apps. I titled it “Evaluating Node.js Frameworks: hapi.js, Restify, and Geddy” and it is available via the New Relic Blog. To check out those frameworks give that blog entry a read; below I’ve added one more framework to the list, Strongloop’s Loopback.

Strength: Very feature-rich generation of models, data structures and related enterprise-type needs. Solid enterprise-style API framework library.
Weakness: Complexity could be cumbersome unless it is needed. Not an immediate first choice for a startup going after lean and clean.
Great for: Enterprise API Services.

When I dove into StrongLoop, I immediately got the feel that I was using a fairly polished package of software. When installing with ‘sudo npm install -g strongloop’ I could easily see the other packages being installed. But instead of the normal Node.js display of additional dependencies, the StrongLoop install displayed a number of additional options with a shiny ASCII logo.
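If you want to give it a spin yourself, the overall flow looks roughly like this – a sketch from memory of the slc command line tool that the strongloop package installs, so double check slc --help for the current options:

$ sudo npm install -g strongloop
$ slc loopback          # scaffold a new LoopBack API project
$ cd my-api             # 'my-api' is whatever name you gave the scaffold
$ slc run               # start the API server

From there the generated project already has the model and data structure generation goodies mentioned above wired in.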

Some JavaScript API Coding With Restify & Express & Hacking it With cURL …Segment #2

Ah, part 2! If you’re looking for part 1, click this link.

Review: In the last blog entry I went through more than a few examples of using cURL to issue GET requests against various end points using Node.js & Restify. I also covered the basics on where to go to find cURL in case it isn’t installed. The last part I covered was a little bit of WebStorm info to boot. In this part of the series I’m now going to dive into the HTTP verbs beyond GET.

POST

The standard way to save data via an HTTP verb is a POST. When you issue a POST via cURL, use -X followed by POST to designate the verb, then -H to assign the content type header. In this particular example I’ve set it to application/json since my data payload will be in JSON format. Then add the actual data with the -d option.

curl -X POST -H "Content-Type: application/json" -d '{"uuid":"79E5591A-1E54-4562-A276-AFC266F54390","webid":"56E62C3A-D6BC-4F4F-B72A-E6CE081190B6"}' http://localhost:3000/ident

Other data types can be sent, with the content type set appropriately, including html, json, script, text, or xml. One example of this same command, issued with jQuery on the client side, would actually look like this.

var data = {"uuid":"79E5591A-1E54-4562-A276-AFC266F54390","webid":"56E62C3A-D6BC-4F4F-B72A-E6CE081190B6"};

$.post( "http://localhost:3000/ident", function( data ) {
  $( ".result" ).html( data );
});

When building POST endpoints via Express, one of the things you may run into is the following message being displayed in the console.

/usr/local/bin/node app.js
connect.multipart() will be removed in connect 3.0
visit https://github.com/senchalabs/connect/wiki/Connect-3.0 for alternatives
connect.limit() will be removed in connect 3.0

The immediate fix for this, until those changes land in Connect 3.0, is to replace this line

app.use(express.bodyParser());

with these lines

app.use(express.json());
app.use(express.urlencoded());

So here are some common examples, from a great write up on building basic RESTful APIs with Node.js and Express on the Modulus blog.

var express = require('express');
var app = express();

app.use(express.json());
app.use(express.urlencoded());

var quotes = [
    { author : 'Audrey Hepburn', text : "Nothing is impossible, the word itself says 'I'm possible'!"},
    { author : 'Walt Disney', text : "You may not realize it when it happens, but a kick in the teeth may be the best thing in the world for you"},
    { author : 'Unknown', text : "Even the greatest was once a beginner. Don't be afraid to take that first step."},
    { author : 'Neale Donald Walsch', text : "You are afraid to die, and you're afraid to live. What a way to exist."}
];

app.get('/', function(req, res) {
    res.json(quotes);
});

app.get('/quote/random', function(req, res) {
    var id = Math.floor(Math.random() * quotes.length);
    var q = quotes[id];
    res.json(q);
});

app.get('/quote/:id', function(req, res) {
    if(quotes.length <= req.params.id || req.params.id < 0) {
        res.statusCode = 404;
        return res.send('Error 404: No quote found');
    }

    var q = quotes[req.params.id];
    res.json(q);
});

app.post('/quote', function(req, res) {
    if(!req.body.hasOwnProperty('author') ||
        !req.body.hasOwnProperty('text')) {
        res.statusCode = 400;
        return res.send('Error 400: Post syntax incorrect.');
    }

    var newQuote = {
        author : req.body.author,
        text : req.body.text
    };

    quotes.push(newQuote);
    res.json(true);
});

app.listen(process.env.PORT || 3412);

This is a great little snippet of code to test your cURL commands against.
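For example, assuming the app is running on its default port of 3412 (no PORT environment variable set), a few requests to try might look like this – the quote values are just hypothetical samples:

curl http://localhost:3412/
curl http://localhost:3412/quote/random
curl http://localhost:3412/quote/2
curl -X POST -H "Content-Type: application/json" -d '{"author":"Yogi Berra","text":"You can observe a lot by just watching."}' http://localhost:3412/quote

The POST should come back with true, and a follow-up GET against / will show the new quote appended to the list.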


Some JavaScript API Coding With Restify & Express & Hacking it With cURL …Segment #1 (with some Webstorm to boot)

So often I end up putting together some RESTful services (or the intent is to at least build them with that premise, but we all know how that ends up). The API URI routing gets put together and one wants to take a crack at the service as soon as possible. Here’s a quick guide for using cURL to take some basic actions against the services and understand what you’re getting back.

The first thing to do is make sure you can run JavaScript, which means you have a computer. The second thing is to get cURL, which means you’re running some variant of Linux or UNIX. In most scenarios one would be running OS X. The easiest way to determine if it is installed on your computer is to just open up a terminal and type ‘curl --help’. You should get a result with all the switches, which is almost always a bit of overload.

$ curl --help
Usage: curl [options...]
Options: (H) means HTTP/HTTPS only, (F) means FTP only
     --anyauth       Pick "any" authentication method (H)
 -a, --append        Append to target file when uploading (F/SFTP)
     --basic         Use HTTP Basic Authentication (H)
     --cacert FILE   CA certificate to verify peer against (SSL)
     --capath DIR    CA directory to verify peer against (SSL)
 -E, --cert CERT[:PASSWD] Client certificate file and password (SSL)
     --cert-type TYPE Certificate file type (DER/PEM/ENG) (SSL)
     --ciphers LIST  SSL ciphers to use (SSL)
     --compressed    Request compressed response (using deflate or gzip)
 -K, --config FILE   Specify which config file to read
     --connect-timeout SECONDS  Maximum time allowed for connection
 -C, --continue-at OFFSET  Resumed transfer offset
 -b, --cookie STRING/FILE  String or file to read cookies from (H)
 -c, --cookie-jar FILE  Write cookies to this file after operation (H)
     --create-dirs   Create necessary local directory hierarchy
     --crlf          Convert LF to CRLF in upload
     --crlfile FILE  Get a CRL list in PEM format from the given file
 -d, --data DATA     HTTP POST data (H)
     --data-ascii DATA  HTTP POST ASCII data (H)
     --data-binary DATA  HTTP POST binary data (H)
     --data-urlencode DATA  HTTP POST data url encoded (H)
     --delegation STRING GSS-API delegation permission
     --digest        Use HTTP Digest Authentication (H)
     --disable-eprt  Inhibit using EPRT or LPRT (F)
     --disable-epsv  Inhibit using EPSV (F)
 -D, --dump-header FILE  Write the headers to this file
     --egd-file FILE  EGD socket path for random data (SSL)
     --engine ENGINE  Crypto engine (SSL). "--engine list" for list
 -f, --fail          Fail silently (no output at all) on HTTP errors (H)
 -F, --form CONTENT  Specify HTTP multipart POST data (H)
     --form-string STRING  Specify HTTP multipart POST data (H)
     --ftp-account DATA  Account data string (F)
     --ftp-alternative-to-user COMMAND  String to replace "USER [name]" (F)
     --ftp-create-dirs  Create the remote dirs if not present (F)
     --ftp-method [MULTICWD/NOCWD/SINGLECWD] Control CWD usage (F)
     --ftp-pasv      Use PASV/EPSV instead of PORT (F)
 -P, --ftp-port ADR  Use PORT with given address instead of PASV (F)
     --ftp-skip-pasv-ip Skip the IP address for PASV (F)
     --ftp-pret      Send PRET before PASV (for drftpd) (F)
     --ftp-ssl-ccc   Send CCC after authenticating (F)
     --ftp-ssl-ccc-mode ACTIVE/PASSIVE  Set CCC mode (F)
     --ftp-ssl-control Require SSL/TLS for ftp login, clear for transfer (F)
 -G, --get           Send the -d data with a HTTP GET (H)...

Don’t get intimidated! It goes on and on and on, but just know it’s installed if you see all these goodies. If you don’t get the results above, then installing cURL is the next step. I’ll leave that to you – the downloads page on the cURL site will get you started.

Next you’ll of course need Node.js and Restify installed. I’ll assume you have Node.js installed. Create a directory and in that directory just run the following command.

npm install restify

Next create a file called server.js in the directory you’ve just installed restify in. Here’s the initial JavaScript code for that file, which I’ve put together for the first few examples of using cURL.

var restify = require('restify');

function respond(req, res, next) {
    res.send('hello ' + req.params.name);
}

var server = restify.createServer();
server.get('/hello/:name', respond);
server.head('/hello/:name', respond);

server.listen(8080, function() {
    console.log('%s listening at %s', server.name, server.url);
});

Ok, now to run this with Node.js, just issue the command to launch it with the file that was just created.

node server.js
restify listening at http://0.0.0.0:8080

Getting Get

Now the service is running on port 8080 against 0.0.0.0. To check out what a standard GET verb will do in a browser, open up a browser and navigate to http://0.0.0.0:8080.

Browsing the GET response via Chrome.

You’ll see this in the browser window – just straight plain text. If you look at the source, this is all you get back. Now open up a terminal and run the following cURL command to execute a GET against the URI and port. This is the most basic cURL command one can make: it simply issues a GET request against the URI and displays the body of the response.

curl 0.0.0.0:8080

The response will be similar to this for the particular request.

{"code":"ResourceNotFound","message":"/ does not exist"}

Your terminal will probably stick the subsequent prompt at the end of the result too, because the result doesn’t end in a newline. Beware of that, your prompt hasn’t disappeared. 😉
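A small aside of my own, beyond what the original examples used: curl’s -w (--write-out) option can tack a newline onto the output so the prompt lands on its own line.

curl -w "\n" http://0.0.0.0:8080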

To get a little more information you can get the header of the response dumped into the terminal with a -i. The -i option stands for --include, as in include the header. Issue the command as either line shown below.

curl -i http://0.0.0.0:8080
curl --include http://0.0.0.0:8080

The response will provide a little bit more about what is going on.

HTTP/1.1 404 Not Found
Content-Type: application/json
Content-Length: 56
Date: Wed, 27 Nov 2013 00:27:36 GMT
Connection: keep-alive

{"code":"ResourceNotFound","message":"/ does not exist"}

With this response the actual response code is shown. In this case we have a 404, which points us to the problem with this curl request: the server isn’t returning anything for the root path. If we look at the code, we can see that the ‘get’ route is set up as ‘/hello/:name’, which means the server only responds to requests like http://url_root/hello/someName.

var server = restify.createServer();
server.get('/hello/:name', respond);
server.head('/hello/:name', respond);

Issue a command against the server now with the following curl request.

curl -i http://0.0.0.0:8080/hello/Adron

The response should come back as an actual response with content.

HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 13
Date: Wed, 27 Nov 2013 00:34:04 GMT
Connection: keep-alive

"hello Adron"

Here the content is returned as “hello Adron” and the header returns a 200. The content type is application/json format with the length returned as 13. Note also the connection is set to keep-alive. Let’s dive into that.

If we change the connection type, which is important for many scenarios, we have to send extra header information to ask for the response to be returned accordingly. To do that we can pass the -H or --header option in with the curl request. If the command is issued with -i and -H as shown below, the result will be as follows.

curl -iH "connection: close" http://0.0.0.0:8080/hello/Adron
HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 13
Date: Wed, 27 Nov 2013 00:41:07 GMT
Connection: close

"hello Adron"

If we take away the -i we’ll just get the response body, which is “hello Adron”, and wouldn’t see the header, which now returns Connection: close. By default, curl sets the connection to keep-alive, but in order to make the request return right away the connection needs to be asked to close. By setting the -H or --header value of connection to close, we get the response immediately. With restify, it is also important to note that it checks whether the user agent is curl.

If it is curl, restify sets the connection header to close and removes the content-length header. However, I’ve seen restify not do this in all circumstances, or perhaps the user agent detection behaves differently in some of my usage. So don’t always assume this will be the case. The safest bet is to explicitly close the connection when done, by adding -H or --header and setting connection to close with a “Connection: close”.
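An easy way to see that user agent detection for yourself – my own little experiment, not something from the restify docs – is to override curl’s default user agent with -A (--user-agent) and compare the headers that come back:

curl -i http://0.0.0.0:8080/hello/Adron
curl -i -A "Mozilla/5.0" http://0.0.0.0:8080/hello/Adron

If restify’s curl detection kicks in, the first response will differ in its Connection and Content-Length headers from the second.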

Beyond Basic Get

Ok, so that’s a pretty solid use of GET with cURL. Let’s dive into some puts and deletes with a get or two thrown in for comparison. Change the executing code to the code shown in the server.js file below.

var restify = require('restify');

function send(req, res, next) {
    res.send('hello ' + req.params.name);
    return next();
}

var server = restify.createServer();
server.post('/hello', function create(req, res, next) {
    res.send(201, Math.random().toString(36).substr(3, 8));
    return next();
});
server.put('/hello', send);
server.get('/hello/:name', send);
server.head('/hello/:name', send);
server.del('/hello/:name', function rm(req, res, next) {
    res.send(204);
    return next();
});

server.listen(8080, function() {
    console.log('%s listening at %s', server.name, server.url);
});

The first section of code to check out is around the function send.

function send(req, res, next) {
    res.send('hello ' + req.params.name);
    return next();
}

This function is set up to take req, res, and next. The req is the request, the res is the response, and next is a callback that continues on to the next handler in the chain. The next bit of code starts the server with restify.createServer(). Just below that, several handlers are set up.

server.post('/hello', function create(req, res, next) {
    res.send(201, Math.random().toString(36).substr(3, 8));
    return next();
});
server.put('/hello', send);
server.get('/hello/:name', send);
server.head('/hello/:name', send);
server.del('/hello/:name', function rm(req, res, next) {
    res.send(204);
    return next();
});
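Before the sidetrack below, here’s a rough sketch of how one might poke at those new routes with cURL – the name is just a sample value:

curl -i -X POST http://0.0.0.0:8080/hello
curl -i -X DELETE http://0.0.0.0:8080/hello/Adron

The POST should return a 201 with a random string for a body, and the DELETE should return an empty 204.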

Now at this point I got a little sidetracked writing this blog entry. But I thought to myself, “hell, I’m just figuring out some parts of Webstorm, I ought to blog a little about it!” So, here’s…

A Little Webstorm Love

Webstorm and cURL.

Before continuing on I wanted to cover a few tidbits of the Jetbrains Webstorm IDE. I often switch back and forth between the Sublime/Terminal combo and the Webstorm IDE. The really cool thing about this IDE is that it actually has a Terminal built in, color coding and autocompletion of code, refactoring, a file and folder viewer, and a whole slew of other features. In the image above there are four neon pointers displaying some of the key functionality I’m using to work through this blog entry with cURL and Restify.

The arrows, from left to right, point to the following IDE elements. The first points to the JavaScript files storgie.js and starter.js, which I added specifically to show the git status colors. Each color reflects whether the file is new (green), has changes (light blue), or is committed with no changes (white). The second arrow points to the general folder structure. Here you can see the hidden .* files like .gitignore and .npmignore, and it’s also easy to dig through the node_modules directory. Webstorm also uses the node_modules directory to provide extra information and autocompletion as you work through your coding session. The next arrow points out the terminal in the editor, which is where I’m working up the curl examples in this blog entry. Then of course there’s the color coded starter.js file, one of the working examples. Webstorm, simply, is pretty sweet. I’m looking to do some more walkthroughs and work sessions with the editor in the near future. So if interested, be sure to keep reading and subscribe; I’ll post links to wherever the material ends up right here.

Now, back to the cURLing. 😉

After I toyed around with Webstorm a bit to get it working in a way that was efficient for me to develop these APIs, I stumbled into an idea. I’d provide a page for the APIs located at the root of the API service, such as http://api.blagh.com. The APIs would still be a RESTful type schema like http://api.blagh.com/thing/create or http://api.blagh.com/thing/destroy, but at the very root would be a kind of docs page. Maybe this could just be a status page even. Whatever the case, there needs to be something at http://api.blagh.com, so I decided right then and there to switch to express.js to build the rest of the API services. Restify is fine and all, but for this it seemed like express would have all the pieces I need.

Just to boot, I then read a few articles about express being faster, such as this one. But then I read this issue on GitHub and almost thought, “maybe I should keep using restify,” but then I thought, “dammit, just get it done the way you want it built,” so it was back to express. It’s easy enough to change later, so I just got back to coding, albeit with express now. So keep reading, and in the next day or two I’ll have part two of this series on using cURL to hack at your APIs.

Enjoy the composite coding & cheers!


Getting Github : JavaScript Libraries Spilled EVERYWHERE! Series #003

This is an ongoing effort putting together some JavaScript app code on client and on server that started with blog entry series #001 and #002.

This how-to is going to kind of go all over the place. My goal is to get github data; the question, however, is how and with what. I knew there were some available libraries, so writing my own client and pulling straight off of the API myself seemed like unnecessary work.

The github API documentation is located at http://developer.github.com/v3/ with the list of client libraries for ease of access listed at http://developer.github.com/v3/libraries/. The first one I forked and cloned was gh3, and then I npm installed the octonode and node-github libraries.

Node.js Based Github Libraries

The two Node-based projects install via npm, as things go with Node, and were super easy. The first one I gave a test drive is the https://github.com/ajaxorg/node-github project. I forked it and dove right in.

$ npm install github
npm http GET https://registry.npmjs.org/github
npm http 200 https://registry.npmjs.org/github
npm http GET https://registry.npmjs.org/github/-/github-0.1.8.tgz
npm http 200 https://registry.npmjs.org/github/-/github-0.1.8.tgz
$

After that quick install I took a stab at the test code they have in the README.md.

var GitHubApi = require("github");

var github = new GitHubApi({
    // required
    version: "3.0.0",
    // optional
    timeout: 5000
});
github.user.getFollowingFromUser({
    user: "adron"
}, function(err, res) {
    console.log(JSON.stringify(res));
});

This worked all well and good, so I moved on to some other examples. The following example, however, needed authentication. To authenticate you’ll need to add the little snippet below with the username and password. There’s also an OAuth token method you can use, which I’ve not documented below; to check out other auth methods, see the documentation.

var GitHubApi = require("github");

var github = new GitHubApi({
    version: "3.0.0", timeout: 5000,
});

github.authenticate({
    type: "basic",
    username: "adron",
    password: "yoTurkiesGetYourOwn"
});

github.orgs.get({
	org: "Basho"
}, function(err, res){
	console.log(res);
});

The result is perfect for putting together a good display page or something similar for the organizations.

$ node adron_test.js
{ login: 'basho',
  id: 176293,
  url: 'https://api.github.com/orgs/basho',
  repos_url: 'https://api.github.com/orgs/basho/repos',
  events_url: 'https://api.github.com/orgs/basho/events',
  members_url: 'https://api.github.com/orgs/basho/members{/member}',
  public_members_url: 'https://api.github.com/orgs/basho/public_members{/member}',
  avatar_url: 'https://secure.gravatar.com/avatar/ce5141b78d2fe237e8bfba49d6aff405?d=https://a248.e.akamai.net/assets.github.com%2Fimages%2Fgravatars%2Fgravatar-org-420.png',
  name: 'Basho Technologies',
  company: 'Basho',
  blog: 'http://basho.com/blog/',
  location: 'Cambridge, MA',
  email: null,
  public_repos: 105,
  public_gists: 0,
  followers: 0,
  following: 0,
  html_url: 'https://github.com/basho',
  created_at: '2010-01-04T19:05:19Z',
  updated_at: '2013-03-17T20:29:09Z',
  type: 'Organization',
  total_private_repos: YYY,
  owned_private_repos: XXX,
  private_gists: 0,
  disk_usage: 788016,
  collaborators: 0,
  billing_email: 'not_a_valid_address@basho.com',
  plan: { name: 'platinum', space: 62914560, private_repos: billions },
  meta: { 'x-ratelimit-limit': '5000', 'x-ratelimit-remaining': 'azillion' }
}

Now at this point there are a few significant problems. Setting up integration tests for this library gets real tricky, because you need to authenticate, at least for the data that I want. This doesn’t bode well for sending any integration tests off to Travis-CI or elsewhere. So even though this library works, and would be processed on the server side and not the client side, having it as a non-tested part of the code base bothers me a bit. What’s a good way to set up tests to verify that things are working? One idea is sketched just below, though fully sorting it out will have to be another blog entry, maybe. After that, let’s jump into the client side library and see how it functions.
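The sketch: an HTTP mocking library like nock can intercept the calls to api.github.com so the tests don’t need real credentials. This is an untested idea rather than what I settled on, and it assumes nock plays nicely with this version of the library:

var nock = require('nock');
var GitHubApi = require('github');

// Intercept the orgs endpoint so no real credentials are needed.
nock('https://api.github.com')
    .get('/orgs/Basho')
    .reply(200, { login: 'basho', name: 'Basho Technologies' });

var github = new GitHubApi({ version: '3.0.0', timeout: 5000 });

github.orgs.get({ org: 'Basho' }, function (err, res) {
    console.log(res); // served from the mock, not the live API
});

That still leaves the question of what to assert against, but it would at least keep credentials out of Travis-CI.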

Client Side JavaScript Github

For the client side I started testing with the gh3 library. It has two dependencies, jQuery and Underscore.js. jQuery is likely always going to be in your projects. Underscore.js is also pretty common, but sometimes you’ll find you need to go download the library. Once I had downloaded gh3 and gotten the additional libraries installed, I gave the default sample a shot.

<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8">
    <title>gh3 Sample</title>
</head>
<body>
    <ul id="user"></ul>
</body>
<script src="js/jquery-1.7.1.min.js"></script>
<script src="js/underscore-min.js"></script>
<script src="js/gh3.js"></script>
<script>
	var adron = new Gh3.User("adron")
	,	userInfos = $("#user");

	adron.fetch(function (err, resUser){
		if(err) {
			throw "outch ..."
		}
		console.log(adron, resUser);
		_.each(_.keys(resUser), function (prop) {
			userInfos.append(
				$('<li>').append(prop+" : "+resUser[prop])
			);
		});
	});
</script>
</html>

This worked pretty seamlessly. It also got me thinking, “what do I really want to do with the github library?” If it’s a server side service, obviously I’d want to use the Node.js libraries. However, if it is client side data I want, is it even ideal that the server side pull the data at all? The other issues around cross site scripting and related matters come into play too with a client side script, but this might be, even in spite of that, just what I needed. For now, that left me with some solid things to think about. But I was done for now… so until next entry, cheers!

ALT.NET 2011 Day #1

Today kicked off with an early morning and a great workshop with Glenn Block (@gblock). The team he is working with has some great things coming for HTTP + REST + WCF that will really alleviate a lot of problems with WCF. In addition to that I’d bet that it will give Microsoft a chance to get back into the web API market.

That brings me to another topic that has come up a lot lately. Anyone that is in the startup scene or web development scene knows that Ruby on Rails has made an absolutely massive impact. I’m talking an impact like the invasion of Normandy! Microsoft has made many shifts to counter the ease, simplicity, and elegance of Ruby on Rails with things like ASP.NET MVC. Overall, the efforts have done a good job and been well received by .NET Developers in general.

However, Microsoft has done a horrible job of getting aligned with the Internet startup space when it comes to web APIs. If you’re not sure what I mean, check out Twitter, Facebook, Apigee, and dozens of others. These are all companies that provide web APIs. Another notable one that has had some very real monetary impact is the Best Buy API, which allows people to hook into the API to turn some revenue and make money. These APIs are almost always non-Microsoft stack technologies. When they are Microsoft, it is usually a hack around ASP.NET MVC or something of that sort to enable a more RESTful type API. With these additions to WCF, Microsoft is back on some solid footing to compete in this space. I’m really looking forward to being able to play around with these capabilities of WCF more – and hopefully sooner than later!

The second session of the day was a kind of modern anthropological study of development groups, their culture, and how processes, tools, and team qualities interplay among people. We ended up splitting into 3 (or was it 4) groups and went about various exercises.

I’m not sure of the exact conclusions we came to during this session, but it was fun just to discuss each of our development groups. Topics ranged from how we are forced to use Waterfall (and those clients often end up paying absurd amounts of money for things that should cost them less), to how “pair programming doesn’t exist,” which gave those of us who pair frequently a good laugh, and a whole host of other topics.

All in all, day one started off great. I’m really stoked to be attending these workshops this year. Last year I was hard at work at… some client related debacle of crazy proportions, frantically looking forward to the weekend when the conference would start in earnest. But this year the team I work with is hittin’ the workshops early and doing the ALT.NET Conference completely.

Tomorrow is a session on AppHarbor, also known as Azure done right, and an xUnit workshop by Brad Wilson that I’ll be attending. I’m so excited I’m not sure I’ll sleep. Until then, cheers.