Thursday, December 18, 2008

Google Friend Connect site authentication

OK. Here's the deal:

I want to use GFC to authenticate users for my site.

It sounds like it should be doable, right? I think it is, but at the moment it's rather hard. In fact it's so hard (or I'm so soft) that I haven't managed it yet. Let's recap a bit what GFC (Google Friend Connect) does:

1. If you enable GFC for your site, you can add GFC widgets which let people comment on and rate the page you add them to, and interact in various ways. You can also add any custom OpenSocial widget you like (AFAIK).

2. One of the widgets you must have is the login widget, which lets anybody 'join' your site. When they do, Google opens a separate window which lets them choose their method of authentication: Google, Yahoo, OpenID, etc. (with various sub-options in Google's case).

3. Google then keeps track of the unique users that act on the widgets on your site, lets them unjoin, log in and out, and so on.

4. The original identity provider (Google, Yahoo, etc.) is still used, as always, for changing passwords, user info, etc.

This is really great. The immediate thought on realizing how things work is that you want to somehow leverage this great new functionality that Google provides: a secure, authenticated site where you don't need to manage anyone's identity, password changes, etc.

So, I started trying to do so.

The first, easy step was to create a custom OpenSocial widget on my own site, which I pull in using the custom widget option under 'social gadgets selection' on the GFC admin site: http://www.google.com/friendconnect .

What happens is that GFC generates an HTML snippet that loads an OpenSocial script from Google, which in turn pulls in your script through Google's proxy server. In any event, the script resides in a directory on your server, gets to land on your page (which was just recently in another directory) and can then suddenly use all the GFC magic: check the id and name of the current viewer, his or her friend list, etc. Neat.

But.

If you just post this info back to your server using standard Ajax, and trust what you receive, you're lost. How can you be sure that it really was your script that sent that info? No, you need to make Google's GFC servers send certified info about the current viewer back to your server.

Luckily, there's an OpenSocial call named makeRequest, which makes an Ajax call through the Container (Google's GFC servers in this case, since Google/GFC is the Container; the Container would normally be LinkedIn, MySpace, Orkut, etc.) to any given destination. For example, your server, which actually hosts the page where the GFC widgets are hopping about.

http://code.google.com/apis/opensocial/docs/0.8/reference/gadgets/#gadgets.io.makeRequest

is a great place to learn what can be done with it. For one thing, you can send back some useful info. My simple OpenSocial script did just that, starting out in a nice and easy manner without forcing any encryption, using AuthorizationType.NONE, like this:

...
// Plain-text, unauthenticated request back to my own server
var params = {};
params[gadgets.io.RequestParameters.CONTENT_TYPE] = gadgets.io.ContentType.TEXT;
params[gadgets.io.RequestParameters.AUTHORIZATION] = gadgets.io.AuthorizationType.NONE;
params[gadgets.io.RequestParameters.REFRESH_INTERVAL] = 5;

var url = "http://xxxxxxxxxxxxx"; // my server's endpoint
console.log("calling url... '"+url+"'");
gadgets.io.makeRequest(url, reqcb, params);
...

// Callback: logs whatever came back through the GFC proxy
function reqcb(data)
{
    console.log("reqcb called....");
    console.dir(data);
}
...

And this works well. Looking more carefully at the link above, one reads the following:

If opt_params[gadgets.io.RequestParameters.AUTHORIZATION] is set to gadgets.io.AuthorizationType.SIGNED, the container needs to vouch for the user's identity to the destination server. The container does this by doing the following:

  1. Removing any request parameters with names that begin with oauth, xoauth, or opensocial (case insensitive).

  2. Adding the following parameters to the request query string:

    opensocial_viewer_id
    Optional.
    The ID of the current viewer, which matches the getId() value on the viewer person object.
    opensocial_owner_id
    Required.
    The ID of the current owner, which matches the getId() value on the owner person object.
    opensocial_app_url
    Required.
    The URL of the application making the request. Containers may alias multiple application URLs to a single canonical application URL in the case where an application changes URLs.
    opensocial_instance_id
    Optional.
    An opaque identifier used to distinguish between multiple instances of the same application in a single container. If a container does not allow multiple instances of the same application to coexist, this parameter may be omitted. The combination of opensocial_app_url and opensocial_instance_id uniquely identify an instance of an application in a container.
    opensocial_app_id
    Optional.
    An opaque identifier for the application, unique to a particular container. Containers that wish to maintain backwards compatibility with the opensocial-0.7 specification may include this parameter.
    xoauth_public_key
    Optional.
    An opaque identifier for the public key used to sign the request. This parameter may be omitted by containers that do not use public keys to sign requests, or if the container arranges other means of key distribution with the target of the request.
  3. Signing the resulting request according to section 9 of the OAuth specification.


So, if I could use a SIGNED request instead, the GFC proxy servers would add all the info I need and crave for my service, especially a guaranteed identity of the current page viewer. Using this unique id in my tables gives me the power to associate this identity with anything I'd like on my site. OK. Fine.

What happens when I use SIGNED? Well, the callback function always gets back the verbose, detailed and informative error message "404: Not found". In a JSON object, but still.

Now you might think that I don't know the first thing about public-key cryptography, signing requests and tossing certificates about in the rich and useful variety of almost-compatible formats that the joyful and friendly cooperation between related companies at the beginning of this century gave rise to, but you'd be wrong.

I can't call myself an expert (any longer), but I have quite a good working knowledge of the above. The problem here is that the documentation refers to the public certificate of the container, which the receiving service needs to verify.

Great, but the Container is GFC. Where's their certificate? Maybe this is obvious; it isn't to me, anyway. Reading another page which digs deep into the OAuth spec, I found a link I'd never seen before:

http://code.google.com/intl/sv-SE/apis/gdata/articles/oauth.html


It has links that lead to the registration page for web-based applications, which lets you register sites with Google and deploy their public certificates:

https://www.google.com/accounts/ManageDomains

So I did just that, and used the OAuth playground

http://googlecodesamples.com/oauth_playground/


to verify that things seem to work.

Now, I'm not sure that my site, not being an OpenSocial container (surely!), needs to register a certificate with Google, but just in case that was needed to make the SIGNED request work, I did so and played around with it.

No luck. The result stays the same: "404: not found". Great.

What I (and possibly **ALL** other site owners starting to use GFC) need is some information. Any of the below will be fine, if nothing else to give some info back to the community instead of the usual Google stonewalling:

1. Here's a simple example which demonstrates how to write a simple OS gadget which posts back the current viewer's unique id and info.
2. Rick caught the flu last week and hasn't had time to write the example. We'll get on it in a couple of weeks.
3. We'll never give that information, stupid Englishmen! Go and boil your bottoms, sons of a silly person!
4. It's impossible to do what you want. Go back to bed.
5. Wow, that's an interesting idea. We hadn't thought of that! We'll get back to you..

Having emptied my reservoir of frustrated incompetence, I must end by stating that despite all the trouble I've had trying to get site authentication to work, I've at least got my money's worth :)


[UPDATE]

Kevin Marks commented on one of my posts on the OpenSocial mailing list that GFC does not support signed requests at the moment. We got a number 4! Yay! Thanks for the info, and let's hope this gets added real soon.

Cheers,
PS

Saturday, December 13, 2008

Towards a better integration of i18n in Dojo widgets

Maybe this is old hat to some, but I've been doing some research into i18n and come up with what I think is an interesting way to get better integration with internationalization in layout containers.

To recap, Dojo has, and uses internally, an internationalization system which makes sure that the Cancel button reads 'Annulla' if your browser says that your locale is 'it', and so on.

I have been thinking about various ways of making the use of i18n more dynamic. My first stab at this was a small widget, which could be embedded in a page (in any number of places) and treated its innerHTML as a key string to be looked up in the i18n system and exchanged for the actual message du jour.

This was a rather heavyweight solution (and I also wrongly called dojo.requireLocalization inside the widget, instead of next to the dojo.require statements where it belongs; it was just insane to do that every time an instance of the widget was created, but that's how I roll :), and it was unable to handle translation of HTML attributes, like the title="" inside an anchor element, for example.

I then thought of extending dijit._Templated (a superclass of all Dojo widgets which use 'normal' templates) to do roughly the same for all widget templates. But somehow it didn't feel quite right for my problem.

What I had was a number of pages, which I had translated into a number of HTML 'snippet' files, containing only the markup for what they wanted to do: no body, header or stuff like that. They are used inside ContentPanes which are managed in turn by a StackContainer, which simulates page transitions between preloaded HTML snippets, giving you a mighty quick site where all parts are already loaded in the browser, but only one is shown at a time.

Each of these 'pages' has a lot of text which should be translated, as well as a number of title="" and related element attributes which should also be translated, so a solution which only worked for widget templates would only solve part of the problem.

So what I did was to create a widget which subclassed dijit.layout.ContentPane and overrode onDownloadEnd, the function called when the pane is done loading remote content (the file containing the HTML snippet).


dojo.provide("layout.TranslationPane");

dojo.require("dijit.layout.ContentPane");
dojo.require("dojo.string");
dojo.require("dojo.i18n");
dojo.requireLocalization("layout", "salutations");

dojo.declare("layout.TranslationPane", dijit.layout.ContentPane,
{
    onDownloadEnd: function(pane)
    {
        console.log("layout.TranslationPane callback onDownloadEnd called.");
        var html = this._getContentAttr();        // The raw, untranslated snippet
        var res = dojo.i18n.getLocalization("layout", "salutations");
        html = dojo.string.substitute(html, res); // Swap ${keys} for localized strings
        this._isDownloaded = true;
        this.setContent(html);
    }
});

Not much code? No, that's because I solve my problems a) at the machine where they appear, b) in JavaScript, and c) on the shoulders of Dojo. So there.

That leaves me with snippet files which can contain any number of ${title_of_stuff} tokens anywhere at all, since the 'page' is treated as one string.
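dojo.string.substitute is doing the heavy lifting here. Its core behaviour, minus Dojo's extras like format functions, can be sketched in a few lines of plain JavaScript (a hypothetical stand-in, not Dojo's actual implementation, which for one thing throws on missing keys):

```javascript
// Swap ${key} tokens in a template for values from a dictionary.
function substitute(template, map) {
  return template.replace(/\$\{([^}]+)\}/g, function(whole, key) {
    return (key in map) ? map[key] : whole; // leave unknown tokens alone
  });
}

substitute('<a title="${title_of_stuff}">${hello}</a>',
           { title_of_stuff: "Greeting", hello: "Ciao" });
// → '<a title="Greeting">Ciao</a>'
```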

I think this would be a great thing to add to ContentPane permanently, with a couple of extra properties: an optional locale override, and (not optional) the name of the package and translation file to use for any ${keywords} appearing in the loaded markup.

WDYT?

Cheers,
PS

Monday, December 1, 2008

Sample chapter from my new book - Layout


Promotion is a strange thing. 

I mean, I have written this book and it would be nice if people read it. 

However, I'm no marketing expert and despite appearances, tend to be a bit dismissive of my achievements.

Having said that, let me tell you what's great about the book I wrote;

1) It's just 250 pages long
2) It covers dojo.data in the most approachable way I could write about it - not common :)
3) It delves into as many parts of dojox as I could muster, not just the usual stuff but DragPanes, ExpandoPanes and GridContainers, with examples and tutorials as well.
4) Uses a lot of the actual (svn) source code of Dojo to show and explain different kinds of functionality.
5) Describes the creation of custom Dojo widgets at every opportunity.
6) Heavily opinionated.

That's not enough for you, eh? OK, grab a sample chapter on layout right here, and tell me what you think. No, really, please mail me so I get some feedback on how useful it is for actually learning Dojo, since that's what the title is about, isn't it?

Cheers,
PS

Tuesday, November 25, 2008

The power of Dojo i18n

I said at an early stage that I was going to use Dojo's i18n system to translate snippets of text in World Change Network. Now that the time has come to actually implement it, I realized I had some learning to do to get all the bits and pieces right. As it turns out, using Dojo's i18n system for your own purposes is really simple.

If you take a look at the unit test for i18n at http://download.dojotoolkit.org/current-stable/dojo-release-1.2.2/dojo/tests/i18n.js you see that it uses three major tricks:

1. Load the i18n string bundle of your choice using dojo.requireLocalization("tests", "salutations", locale);
2. Get a specific bundle using var salutations = dojo.i18n.getLocalization("tests", "salutations", "en");
3. Get the specific string for a given key using salutations['hello'];

Let's check the arguments for (1) and (2): only the first two are needed; they describe the Dojo package to look for bundles in, and the name of the bundle. If you want to specify another locale than the one the browser declares, it can be added as a third argument: "sv" for Swedish when the browser would say "en-us", etc.

All well and good, but where do we put our different versions of the strings? As it turns out, in the dojo/tests directory there is a directory named 'nls'. In its root is a file called 'salutations.js'. This is the default key-value translation that i18n falls back on if the locale cannot be found.

Then comes a host of subdirectories: zh, nl, il, dk, de, etc., one for each locale (that you want or can define). In each of these is a separate 'salutations.js' containing locale-specific resources.

The file format looks like this:

{
it: "Italian",
ja: "Japanese",
ko: "Korean",
......
hello: "Hello",
dojo: "Dojo",
hello_dojo: "${hello}, ${dojo}!",
file_not_found:"The file you requested, ${0}, is not found."
}

for the default file, and like this in the 'it' subdirectory:

{
it: "italiano",
hello: "Ciao"
}

And that's it.
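The fallback behaviour can be illustrated in plain JavaScript: keys in the locale-specific bundle shadow the defaults, and anything missing falls back to the default file. (A rough sketch of the idea, not Dojo's actual loader.)

```javascript
// Default bundle (nls/salutations.js) and the Italian override (nls/it/salutations.js)
var defaults = { it: "Italian", hello: "Hello", dojo: "Dojo", hello_dojo: "${hello}, ${dojo}!" };
var italian  = { it: "italiano", hello: "Ciao" };

// Locale-specific keys shadow the defaults; missing keys fall back.
function mergeBundles(def, locale) {
  var out = {}, k;
  for (k in def) { out[k] = def[k]; }
  for (k in locale) { out[k] = locale[k]; }
  return out;
}

var res = mergeBundles(defaults, italian);
// res.hello === "Ciao" (overridden), res.dojo === "Dojo" (fallback)
```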

I began by creating a small, naive custom widget, which substituted the content of the element where it was declared with the localized string found for that content used as a key. It's not very fast, but simple to understand and use.

dojo.provide("layout.translate");

dojo.require("dijit._Widget");
dojo.require("dojo.i18n");

dojo.declare("layout.translate", [ dijit._Widget ],
{
    string : "",

    postCreate: function()
    {
        // Fall back to the element's innerHTML as the lookup key
        if (!this.string)
        {
            this.string = this.domNode.innerHTML;
        }
        console.log("postCreate for layout.translate called. locale == '"+dojo.locale+"'");
        dojo.requireLocalization("layout", "salutations");
        var res = dojo.i18n.getLocalization("layout", "salutations");
        console.log("Translation key was '"+this.string+"'");
        this.domNode.innerHTML = res[this.string];
    }
});

Note that I just copied the whole nls directory from dojo/tests for my own use, and edited the files, leaving the original filename intact, just in case :) A better way of utilizing i18n would be to have a base class for all custom widgets, which reads in an i18n bundle and injects all keys into each class, so that every subclass has a lot of this._i18n_command_delete properties (if we have a naming convention that all i18n keys begin with '_i18n_', for example).

Then we could have all custom widget templates sprinkle their markup with ${_i18n_command_delete} and so on, which would pull in the current value of that 'this' property of the widget when it is rendered on the page.
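The injection step could look something like this in plain JavaScript (a hypothetical sketch of the idea; nothing like this exists in Dojo, and the key names are made up):

```javascript
// Copy every key from a bundle onto a widget instance, prefixed with
// '_i18n_', so templates can reference ${_i18n_command_delete} and so on.
function injectBundle(widget, bundle) {
  for (var key in bundle) {
    widget["_i18n_" + key] = bundle[key];
  }
  return widget;
}

var w = injectBundle({}, { command_delete: "Delete", hello: "Ciao" });
// w._i18n_command_delete === "Delete"
```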

Hmm....

Come to think of it, it should be possible to put this inside dijit._Widget, or possibly _Templated, which would make it spread to all custom widgets automatically. The only thing needed would be to prepend '_i18n_' to all key names, so that 'command_delete' inside a bundle file would become this._i18n_command_delete in the widget.

One would also need a convention that this only works if the developer puts an 'nls' directory under the package directory where the widgets of a certain package are declared, following the layout described earlier.

Actually, this would be a pretty neat idea. It's my own fault for using a blog post instead of Trac to add feature requests! Oh, you mean I could do it myself? OK, I'll add it to the queue :)

Cheers,
PS




Monday, November 17, 2008

Codebits 2008

Last Wednesday I flew down to Lisbon to attend and speak at the SAPO Codebits conference. I had a really great time, meeting a lot of people I'd only tweeted or mailed with before.

As always, Lisbon is a treat. People are helpful and very respectful. A colleague asked a very spot-on question today when I told him about the trip: "Would you like to go back there again soon?", to which I would emphatically answer: Yes! :)

Many times, when you've spent a couple of days in a city you get kind of fed up with the experience, but Lisbon is not one of those towns.

It didn't hurt one bit that I left a cold and rainy Stockholm to land in a sunny and warm (17-20 degrees centigrade) Lisbon.

The hotel (Galé 'Opera') and the venue were located in an area just alongside the 'Docas', an hour's walk or so from the city centre, but with a lot of rustic ambience all of its own.

I had so many things to attend to before leaving that I was only partially read up on the conference and my fellow speakers. It turned out that the conference was much larger than I had anticipated, with a host of Portugal's finest hackers.

In many ways the Codebits conference felt like Google I/O: a lot of multi-coloured couches, interesting lighting, free soda and coffee, and the ability to turn to _anyone_ at random and strike up a conversation revolving around important and interesting things, with the person in question able to understand you and bring his or her own ideas into the discussion like it was nothing.

One of my personal surprises was Mário Valente, whom I knew only as a very smart person with a stupendously correct view of the current trends in web development, as well as an extremely decent Python and JavaScript hacker. It turned out he started Portugal's first ISP as well as a couple of other successful ventures, and is also in the middle of something bigger, more of which perhaps later..

We had been planning to do some SSJS hacking during the conference, but I got entangled in some last-minute changes to my book and Mário had both teaching and business issues to attend to, so it remained a very good idea. Mário had some really good input on the current problems of server-side JavaScript efforts: 1) There's no 'killer' framework (yet) for SSJS. 2) If one were created, it would not be portable, due to the fact that the system APIs that Rhino exposes are not the same as those Spidermonkey (for example) does.

So we were thinking along the lines of arranging a standard SSJS low-level API, based on the Java APIs exposed by Rhino (Mário's idea), so that SSJS implementors would have a stable platform to work from.

Also, we're planning to do a thorough survey of the cloud and no-cloud SSJS frameworks available today, to rank them and get inspired by best-of-breed features.

My talk? It went OK. I managed to miss speaking to anyone in charge, which meant I didn't get a headset, which meant I had to hulk over my 20.1-inch portable Acer 9920, since the only other mic was welded to the pulpit.

No worries though. Except that I had the wrong settings for the TV-out in Kubuntu and had to turn my head for every slide to orient myself, having no other feedback on the screen. And I got a call in the middle, having forgotten to turn off my phone. But other than that, sure, things went OK.

I also had a great time speaking to Jan Lehnardt from CouchDB about REST, Dojo, Erlang and (of course) CouchDB.

At the speakers' dinner I happened to sit across from Mitch Altman, who had a great number of things to say on all possible subjects, from electronics, dairy products and Brian Eno to art, music, war and interesting people in general. I tried to chip in with some good comments, and since this is as good a time as any, the books I spoke about were 'My War Gone By, I Miss You So' (horrible title, one of the best books I've read, and not nearly as gory as you might think) by Anthony Loyd, 'A Year with Swollen Appendices' by Brian Eno, and 'On Some Faraway Beach' by David Sheppard.

Another member of the table proved to have alarmingly good knowledge of the inner workings of Russian botnet 'corporations': the current evolution of 'black' insurance and business agreements concerning buying and selling botnet services, the 'double-NAT-with-homebrew-VM' setup of today's malware containers, and the trend towards botnets becoming more of a symbiotic affair, compared to the earlier parasitic standpoint, what with the automatic patching and maintenance that must be provided to ensure smooth running. A horror ride, to be sure, but no less fascinating for that.

Actually, I spoke to so many of you, and most had no cards, so please mail me if you want to stay in contact (my address is in the sidebar of this page).

All in all I met such an enormous number of friendly and interesting people (Celso, João, Jack and everyone else) that it'll probably take me weeks to sort out the experience.

Thanks a lot to everybody involved!

Wednesday, November 5, 2008

A reasonably complicated custom Dojo widget example


I get a lot of questions from people on how to do this and that with Dojo, sometimes very specific and sometimes about how to approach problems in general. I don't really consider myself experienced enough to speak about all things Dojo; I'm actually just more of a fanboy.

Most of my answers, though, boil down to some basic things, where the most common is: make a custom widget. This is key, but seemingly missing from many frameworks/toolkits today. I feel there have been three 'generations' of JavaScript usage in the browser in recent years, and many of the flame wars and misunderstandings might be due to the fact that people have a lot of misaligned assumptions when speaking of JavaScript programming. In my view the different generations or stages have been the following:

1. JavaScript inlined in the page - Netscape 4 / IE3 kludgery [Animal House - the food-fight scene]
2. Clean HTML with consistent and meaningful styling and classification, with all JavaScript logic in a couple of separate files which operate on said markup, transforming it, attaching event handlers, etc. - jQuery [2001 - the space-shuttle approach to the twin-wheel space station].
3. Client-side hierarchical components with modularized logic and templates - Dojo [Basic Instinct - the single crotch frame].

First of all, jQuery might have a modularized templating system where you separate widget markup from logic; I'm not experienced enough with it (yet) to tell. Please comment if you have some good references. I'd also like to say that both Ext and JavaScriptMVC have excellent templating systems for their components, in a very similar vein to what Dojo has (again, AFAIK). But I had to choose one good example out of each group, or at least it felt that way.

What do you gain by writing custom components all the time? It seems awfully complicated, doesn't it?

OK, what do you gain (in Dojo)? Let's see:

1. You get an enforced structure that helps you separate view and logic inside the widget.
2. You get opinionated support from the framework for automatic id generation, coupling markup elements to widget references, and widget lifecycle management (certain named functions get called at certain times).
3. You get _guarantees_ that the widget is hermetically sealed, unless you do something completely stupid. This means you can take it out of one place and put it somewhere else, or change your mind about having three and put in four instead. No collisions, no overlaps.
4. Widget template markup can itself (at least in Dojo) contain other, generally arbitrary, widgets. Turtles all the way down, basically.
5. The whole mélange can be expressed in the HTML file that is actually loaded by the user with one (1) div.


But other than that, I suppose, not much.

Since I really only know Dojo, I will be using it in my example. Let's say I want a widget with dynamically generated, JavaScript-only 2D charts, where the charts can be dragged, dropped and reordered just like on iGoogle or similar pages. Wouldn't that be cool?

Let's start with the target HTML file, which looks like this:





I begin by loading some Dojo CSS and the actual toolkit base itself from AOL's CDN (Google's would have worked as well, of course). Then I configure the djConfig variable a bit so that Dojo finds the local files for the widget referenced later, even though it is loaded cross-domain.

The dojo.require statements check if the referenced classes are available and, if not, resolve and load them (since the custom component 'multichart.main' is not part of Dojo, I needed to point out where to find it in the modulePaths setting in djConfig earlier).

Then, as just one, albeit fat, div tag, I define the draggable multichart container. As you see, Dojo uses custom HTML attributes to let its parser instantiate widgets declared in markup (all widgets can also be created programmatically in the classical style, of course). The dojoType attribute declares a widget, which must already be loaded. All other attributes after that are inserted as 'this' properties of the instantiated widget class, if it has already declared those names. Works smoothly, truly.

I basically just pass one argument, which is an associative array of names and values. The idea is that the names become title strings on the charts, and the values point out URLs where JSON data is provided to generate the charts.

The test file foo.txt looks like this:

{
series:
{
series1: [{x: 1, y: 0.2}, {x: 2, y: 0.5}, {x: 3, y: 1.2}, {x: 4, y: 0.3}],
series2: [{x: 1, y: 0.5}, {x: 2, y: 1.0}, {x: 3, y: 0.9}, {x: 4, y: 1.7}]
}
}

So nothing magic: just a JSON object with a property 'series' and one or more series of numbers, following the standard way of feeding the dojox.charting API.

But back to the custom widget. Since I declared that Dojo should look for widgets beginning with 'multichart' in the directory of the same name in the directory the HTML file was loaded from, I place a file called 'main.js' there, which looks like this:


dojo.provide("multichart.main");

dojo.require("dijit._Templated");
dojo.require("dijit._Widget");

dojo.require("dijit.layout.ContentPane");
dojo.require("dojox.layout.GridContainer");

dojo.require("multichart.chart");

dojo.declare("multichart.main", [ dijit._Widget, dijit._Templated ],
{
    templatePath : dojo.moduleUrl("multichart", "templates/main.html"),
    widgetsInTemplate : true,

    content : {}, // name - url pairs to make draggable charts out of. Must be passed from the calling script / page
    columns : 3,  // How many columns we want in the GridContainer

    constructor: function()
    {
        console.log("constructor for multichart.main called");
    },

    postCreate: function()
    {
        console.log("postCreate for multichart.main called. mycontainer is "+this.test);
        this.gc = new dojox.layout.GridContainer(
        {
            nbZones: this.columns,
            opacity: 0.7,                     // For avatars of dragged components
            allowAutoScroll: true,
            hasResizableColumns: false,
            isAutoOrganized : true,
            withHandles: true,
            acceptTypes: ["multichart.chart"] // This property must always be present, and can take any kind of widget, including your own
        }, this.test);
        this.gc.startup(); // When creating some dojox widgets programmatically, you must call startup() manually to make sure they're properly initialized
        var i = 0; // Column to place the next chart in
        var p = 0; // Row to place the next chart in
        for(var name in this.content)
        {
            var url = this.content[name];
            var chart = new multichart.chart({name: name, url: url}); // Create a custom chart with the given name and url
            console.log("adding chart "+chart+" to zone "+i+", place "+p);
            this.gc.addService(chart, i, p); // Add the chart to the GridContainer (this function will be renamed addChild in the future, for conformance with similar containers)
            i++;
            if (i > this.columns-1) // Go to the next row if we've hit the end of the columns
            {
                i = 0;
                p++;
            }
        }
    }
});

It begins by telling Dojo that this file contains the class 'multichart.main'; then follow a lot of requirements which would otherwise have to be present in the main HTML file. Then follows the widget class declaration.

As you can see, it inherits from two superclasses: _Widget, which contains the widget subsystem logic, and _Templated, which manages HTML template snippets.

Note the 'content' and 'columns' properties which, since they are defined here, can be passed from a markup declaration (or from programmatic creation, where the arguments get passed in an object literal as the first argument to the constructor). This also gives me a good place to put default values which stand if not overwritten (such as 'columns').

postCreate gets called when the widget is ready for action, and this is where I usually put most of my init code. What this widget does is create a Dojo component which mimics an iGoogle page and lets the user drag, drop and rearrange all other Dojo widgets inside it. Then it loops over the argument and creates any number of instances of yet another custom component, 'multichart.chart', which is also present in the same directory.
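The placement arithmetic in that loop is simple row-major filling; isolated as plain JavaScript (the names and column count here are made up):

```javascript
// Compute (column, row) positions the way postCreate's loop does:
// fill each row left to right, then move to the next row.
function gridPositions(names, columns) {
  return names.map(function(name, idx) {
    return { name: name, column: idx % columns, row: Math.floor(idx / columns) };
  });
}

gridPositions(["cpu", "mem", "disk", "net"], 3);
// → cpu:(0,0), mem:(1,0), disk:(2,0), net:(0,1)
```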

Each widget which derives from _Templated can define an HTML template inside a string or in an external file. I usually use external files when developing, and then use Dojo's offline build system to inline templates, compress, concatenate files into one, etc.

If you want to see the widget in action for yourself, you can download it as an archive here. You might also be interested in my upcoming book "Learning Dojo", where it is explained more fully, along with numerous other examples :) OK, I admit it, this whole post was mostly a shameless plug for the book, but partly I wanted to define the playing field a bit as well.

Cheers,
PS

Sunday, November 2, 2008

How to define a front-end developer

Devoting myself (or allowing myself to be dragged away by the carnival in childish glee) to Ajax and JavaScript these last years has changed the way I look at the composition of software teams and projects. Today, the breakdown is glaringly obvious to me. But since it is the result of a kind of private mental journey, what is obvious to me need not be so to someone else.

I'm talking about the emerging role of the front-end developer, of course. Big deal, you might say, I know what that guy does, but I'm not completely sure that everybody agrees on that, hence this post.

In the days of yore, what you did on a computer was program, period. Then came specializations of different kinds, like a dot painted on an expanding balloon becomes a circle. And still the expansion continues.

The traditional breakdown of duties in web-related projects has been into browser-specific things and server-specific things. That's actually very good, since this breakdown at least in part acknowledges the client as an entity. The role of the front-end (browser) team has been to create and manage the design and looks of the application, whereas the back-end team creates and manages _all_ application logic, including client states (open trees, selected tabs, etc.).

The front-end team came to be called designers, with an emphasis on producing HTML/CSS templates and all the Photoshop magic required for the quiltwork of images needed to give an illusion of a real application.

Later on these designers were asked to do some dynamic scripting to produce effects of various kinds, effectively becoming programmers in a way, even though their original vocation was that of esthetics and visual design. I'm not saying you can't mix the two, I'm just saying that in my experience the two fields draw two different personalities.

And on the other end, since the server-based frameworks need to slice, pickle and transform the provided templates into something their server-side templating systems can use, the back-end developers get exposed to a lot of borderline design decisions, which is not something they originally signed up for either.

The result is a sudden risk that people are required and/or forced to make decisions and produce work for which they have little talent and which they don't really care for, personally.

But that is a discussion of whether or not to use Thin Server Architecture (or any of its synonyms and special cases). So let's assume that we have a sane architecture with a cleanly separated front-end and back-end (REST or RESTish, preferably): how do we look for talent in this brave new world?

Looking at ads for front-end developers shows a plethora of confusion. First of all, the same kind of job might be advertised as designer, UI expert, Ajax developer, front-end programmer, Web designer, JavaScript programmer or any permutation thereof, many of which can be applied to a true designer role as well.

Looking at the skill set that the front-end developer should have, we find that he or she must be a veteran of all major US armed conflicts since WWII, be able to design, build and fly a commercial airliner, have discovered at least three different cures for cancer (of which at least one should have received the Nobel Prize in Medicine, Physics or Chemistry) and also be the founder of two or more major schools of painting, including cubism.

It seems that people are both a little desperate and a little confused. Let me try to make things clearer.

In today's world, we need to have a good designer. The word designer will only be used in a classical sense, i.e. visual design. Skill-set: HTML/CSS, Photoshop/Gimp/whatnot, Usability, Fonts, Color theory, etc. Secondary skills: JavaScript, HTTP, Networking fundamentals, some server-side language.

Then we have the elusive front-end developer. Skill-set: HTML/CSS, JavaScript, Ajax, HTTP, Networking (latency, routing, distributed storage and processing, security), REST, XML (to argue against its use), SOAP/WSDL (see last comment). Secondary skills: Usability, some server-side language.

The only change to the skill-set for the back-end team is that they are no longer forced to understand HTML/CSS and a smattering of JS, since there are no longer any server-side templates to massage.

One of my pet peeves is that there is an architect onboard, but he or she has many times launched the project without even deciding which Ajax toolkit to use. It's comparable to launching a project without having specified which back-end to use (Java? Python? Perhaps SSJS?).

Granted, in many cases the project has _not_ launched and the ad is for someone to come in as a front-end lead for the very purpose of evaluating and recommending technologies to use in the client layer, but there are still enough 'wild card' ads out there to make my hairs stand on end (all three of them).

OK, rant over. I guess what I want to say is that the world is getting larger and more detailed at the same time, and that if you are searching for talent, you had better know what you are looking for, and take the time to be as specific as you can.

Cheers,
PS

Wednesday, October 29, 2008

Google Summer of Code 2008 [GSoC2008]

One of the three highlights of this young, long-haired gentleman's trip to California was to participate in the GSoC 2008 conference. Earlier this year I had been asked to become a mentor for the Dojo foundation. As you might know, Google gives out enormous amounts of money each year to Open Source organizations all over the world, earmarked for students so that they can afford to work on a project for their organization during the summer.

This in itself is really cool and generates (deservedly) no small amount of goodwill for Google. Being a mentor, I was invited, along with some 200 other people, to the conference. At first it felt a bit strange to have an open source conference. I mean, it felt like having a conference on wearing pants. It's a given, more or less. The only times I normally reflect on Open Source licensing is when I see someone walking down the street sporting a nekkid lower half.

Most, if not all, of the focus of an Open Source software developer surely lies in the application or service being created. However, when arriving at the conference I suddenly understood the reason for it. Naturally the focus was on Open Source software, and some of the sessions did indeed focus on project management and licensing issues, but the whole point of the conference was to get to meet the best and the brightest developers in the world - or so it felt at least.

Most people don't code at all. Of those who code, a fair share, say at least 50%, just code as if it were any other kind of job. No real enthusiasm. Of those who are somewhat enthusiastic, there's maybe one in ten who is _really_ enthusiastic about programming. Of those, maybe again 1/10 have the inclination or assertiveness to actually engage in a public Open Source project, and of them not many decide to take on the responsibility of being a mentor (a job I could have done much better, actually, but more on that in some later post).

Anyway, those were the guys I met. But before the conference there was a traditional Thai dinner in downtown Mountain View. The way to the restaurant looked a bit like this from the train station:

There was a lot of happy confusion, assertive Leslie Hawthorns all over the place and a resounding call to arms for the following beer bash at a nearby establishment built for this very purpose. The volume of all these programmers talking at the same time was positively deafening, despite some truly outrageously tasty local beer. I'm so sorry that I never remember names, but I hope you know who you are! Please mail me if you want to get in contact.

I met the project manager for the NTP project and was surprised to hear that something relied on by so many and used daily for a number of purposes is not getting the economic attention it really deserves. I met a lot of guys from Germany working on MoinMoin, and another who worked on Zope (which I have actually tried out). There was Boombox, an Open Source project to replace the OS on an iPod (and other things), several ambitious bio-CMS systems (I'm a reformed molecular biologist hobbyist myself :) and tons of other people.

The party went on into the night, and I was nearly unable to manage a late-night dry martini at the hotel when I got back, where I had a long discussion with a couple from Alaska who liked to travel a lot (he was a geologist and she an M&A specialist, which gave me the feeling that they were probably able to; traveling can be exhausting in my opinion, but here were climbers of Kilimanjaro who would not agree :). They were also really nice people.) We discussed Obama (Yes!), various economic conundrums, how to pronounce Göteborg, and whether the city in question resides in Sweden or Denmark (the former).

Sadly, since I had to be back home on Sunday, I had to leave on Saturday, leaving me with only four hours of conference. With a rapid taxi out to MTV I decided to make the most of it, and made a great number of good contacts, some with people who agreed with me that we must have met before, but could not recall when (it's just half a world away, really). Tons of good discussions on Dojo and on implementations of REST and general information management. Thanks, everyone I met. It was a pleasure talking to all of you!

And then there was the scale-model (surely?) of the X1 spaceship. And the Lebanese buffet, and the juice bars, and the Testing on the Toilet posters (on the toilet), and the general Googly atmosphere.

Anyway, most of all: Thanks 1.0E6 to Leslie Hawthorn and Chris DiBona (and the rest of the crew) for creating a wonderful experience and creative climate for all of us. And for the Google frisbees. I snatched two to give to the kids when I got home, who were duly impressed and commenced wrecking our living room immediately :)


Cheers,
PS



Tuesday, October 28, 2008

Tech Talk on Thin Server Architecture and Dojo at Google



While I was in the neighborhood, I managed to get invited to do a tech talk at Google's Mountain View offices this last Friday. I tried to describe the drawbacks of traditional server-centric web frameworks and to use Dojo as an example of how to build true client applications, which leads to increased efficiency for the development team and hence to lower production and maintenance costs.

People were in general very nice but razor-sharp and to the point. Very good discussions afterward focusing on build systems and SEO issues.
Here are the slides themselves:



And here is the YouTube video of my presentation as it is:


Alex Martelli was there as well, and had to comment on the size of my computer :) Erik Arvidsson, who should have been my presenter (since I was so nervous I just got up and presented myself as I usually do, sorry Erik!), took some pictures of relative computer sizes between mine and Alex's:

AjaxWorld 2008 impressions III


Trying to catch up with recent events from last week. Above, a shot of the comfortable room at the Sainte Claire, picturing the author's own small and portable Acer 9920 and a jug of local adult beverage.

One of the talks I had anticipated most was Chris Keene's WaveMaker presentation, where WaveMaker's cloud strategy would be unveiled. Chris delivered a very thorough and Monty Python-themed presentation of the state of the cloud space today, with pros and cons of different solutions and with a special focus on Amazon's EC2/S3 offerings.

The reason for the Amazon focus is of course that WaveMaker is building its own service on top of it. The presentation was unfortunately not so focused on what WaveMaker's service would look like or how it would work, but on clouds in more general terms. I do hope that they follow some of my suggestions and make the service more of a 'live' development environment than an IDE that happens to be on the web. Time will tell. Sometime in November the site will go live.

WaveMaker is superficially like Smartclient, TIBCO GI and Bungee Labs Connect (and MS Popfly, etc.). What separates WaveMaker from Smartclient, for example, is that Smartclient is entirely neutral towards the kind of backend being used, whereas WaveMaker projects are exportable as stand-alone WAR archives, to be dropped into any Java-based app server of your choice. If you have a more mixed environment, Smartclient might be a better choice, but if you run a Java-only shop, WaveMaker will mean much quicker deployment.

This is also what separates WaveMaker from Bungee Labs (and the like), which push PaaS (Platform As A Service), where they host and manage the produced apps in pretty much the same way Google App Engine does (except it doesn't come with any IDE). What WaveMaker brings to the table is the possibility of choice (something Chris did emphasize very well), where you can begin hosting your own application, then move it onto (say) an EC2 virtual machine, only to take it back in-house again when the corporate infrastructure or security requirements have changed. It seems that this will only be made simpler with the coming cloud service. We'll wait and see.

Monday, October 27, 2008

AjaxWorld 2008 impressions II


AjaxWorld had nearly all available conference space at the Fairmont Hotel in San José, connected by miles of carpeted classical hotel vestibules. All talks had ample room for attending listeners, but the number present varied (as it does) between different talks.

One interesting fact was that there were two different talks on server-side JavaScript: one from Jaxer and one from the Phobos project. The Jaxer demo was very good. The presenter, Ian Selby, made a small REST server implementation while we sat there, database tables and all. Sure, he was well prepared, but still. The amount of code was very small. Each portion that accessed the database was one line. That's the way it's supposed to be, really.

I do hope that Jaxer moves away from the server-side templating stuff and instead emphasizes the really nice platform and infrastructure they have in place. Cooking DOM JS on the server and pushing the rendered results to the client when JS is unavailable or the wrong version is an edge case at best, and if no JS is present, then attaching event handlers won't be possible anyway, so in practice (having mulled this over the course of some months) I feel that there is little that server-side DOM processing can do that can't be done with plain CSS anyway. Please correct me if I'm wrong :)

The other server-side JavaScript talk was by Roberto Chinnici, who made an excellent presentation of the Phobos SSJS framework. In theory Jaxer and Phobos do pretty much the same thing, but Phobos is much more focused on language integration. I think that Jaxer has something similar, but OOTB Phobos can call and use any Java library or class as well as spawn JavaScript threads. Not hurting whatsoever is a (coming?) port of ActiveRecord from Ruby and a low-level implementation of Gears (which also works with Jaxer, btw).

Something that was mentioned as a feature, jMaki, made me perk my ears up. I checked jMaki out briefly a year or so ago and felt pretty certain that it was something tied to JSF, which made it rabidly uninteresting from my point of view. Luckily Greg Murray, the creator of jMaki, was also present and could now refute that notion. jMaki is apparently something quite separate which can be used with Java, Ruby, PHP and now also SSJS.

All well and good, surely. But what does it do? Quite. jMaki is a cross-toolkit switchboard. What that means is that you can take a form created in Dojo, attach its events to a table from YUI which reads its data using data abstractions from Ext. Is that hot or what??? In the examples available there is always some pesky server-side component with a generic-sounding name needed at every turn, but it does seem at this point that jMaki is not dependent on a server-side component, and if so, it is pure gold.
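The switchboard idea itself is simple enough to sketch in a few lines of plain JavaScript - a publish/subscribe hub that lets widgets from different toolkits talk without knowing about each other. This is only an illustration of the concept; the names and topic strings are made up, not jMaki's actual API.

```javascript
// Minimal publish/subscribe hub, sketching the cross-toolkit switchboard idea.
var hub = {
  topics: {},
  subscribe: function (topic, fn) {
    (this.topics[topic] = this.topics[topic] || []).push(fn);
  },
  publish: function (topic, payload) {
    (this.topics[topic] || []).forEach(function (fn) { fn(payload); });
  }
};

// A 'Dojo form' publishes a selection; a 'YUI table' listens for it.
// Neither side holds a direct reference to the other.
var tableRows = [];
hub.subscribe("/form/select", function (row) { tableRows.push(row); });
hub.publish("/form/select", { id: 42 });
// tableRows now contains the published row
```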

I also saw a presentation by Brent Hamby and Geof Hendrey from nextdb.net, which takes my thin server architecture one step further: no server architecture! They have a database service which can be used directly from an Ajax page with full transport security. They have a model similar to Kerberos, where tickets are issued and encrypted by the server, and include rules of use which must be preserved.

Nextdb.net also has an administrative interface where you can define queries and users in a very simple manner. If you have an application where you can put most of the logic on the client and only use a data store + security, it's a very interesting solution.

All talks were recorded, but it will take up to five weeks until we can see them online (including mine), so I'll post the links to that later.

Cheers,
PS

Wednesday, October 22, 2008

AjaxWorld 2008 impressions I

This is the first in a series of posts on what I saw, talked about and experienced at the AjaxWorld conference this year in San José. The conference has just ended and I'm sitting in the hotel room after a nice crabcake burger and a glass of wine, and will soon head off to Mountain View in search of an SF bookstore.

Anyway.

My impression of the conference as a whole and of the state of the industry (which one might hope would have a correlation) is that the world is being split in two. Luckily, most people aren't particularly interested in the bad part :) What I mean by that is that a sizeable portion of the talks and booths revolved around products that continue to pile complexity upon complexity on developers to shield them from the browser.

I'm primarily talking about Oracle and ICEfaces here, of course. But also talks on GWT and PHP-generated server-side templates for JS did their best to muddy the waters. Of course, this is not really fair on my part, since everyone is trying their best to solve the problems as best they can. If it so happens that someone has struck upon a substandard architecture and hasn't had time to think through all the angles, it's certainly not because of malign intent.

On the whole I was happily surprised to see a lot of companies present whose products and talks were attracting a lot of attention, and who were in essence advocating thin server solutions across the board, wholesale. Smartclient from Isomorphic has grown a _lot_ since I checked it out a year or so ago.

They have basically built 50% of Dojo all by themselves for their client-side framework, and I couldn't help but think: Agnostics, what a waste! How much time it must have taken them. And then immediately: Wow, that's really smart, we should have that in Dojo :) What I was most impressed with was their efficient focus on metadata. Nowadays we have JSON-Schema, which was not a luxury they had when they started out, of course. They have a tool which lets you import schemas of different formats, though, so JSON-Schema support is probably on their radar.

Why are schemas so important? Well, they have made many (all?) components schema-aware, so that if you attach a schema to a Form component, up pops a form with the correct fields: date fields have a datepicker, int fields have validations for integer values, and so on. Also, the form gets generated by itself, of course, and can remake itself dynamically. Why don't we have such a thing in Dojo? Actually, there's no reason at all, but right now I'm blogging. Their grids work in the same way too, and have a couple of extra features like sort fields which can be enabled for any column (filters).
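The core of the schema-aware trick is just a mapping from field types to editors and validators. Here is a rough sketch in plain JavaScript of how such a mapping might look - every name here is invented for illustration, and this is not Smartclient's (or any toolkit's) real API:

```javascript
// Sketch: derive field editors and validators from a JSON-Schema-like object.
function fieldsFromSchema(schema) {
  return Object.keys(schema.properties).map(function (name) {
    var type = schema.properties[name].type;
    return {
      name: name,
      // date fields get a datepicker, everything else a plain textbox
      editor: type === "date" ? "datepicker" : "textbox",
      // integer fields get an integer-format validator
      validate: type === "integer"
        ? function (v) { return /^-?\d+$/.test(String(v)); }
        : function () { return true; }
    };
  });
}

var fields = fieldsFromSchema({
  properties: { born: { type: "date" }, age: { type: "integer" } }
});
// 'born' gets a datepicker editor, 'age' gets an integer validator
```

A real implementation would of course render actual widgets from these descriptors; the point is that the form can regenerate itself whenever the schema changes.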

My only gripe with Smartclient is that the web-based IDE is non-free (but they have a 60-day demo), and also that they have written a whole new toolkit instead of leveraging jQuery, Dojo, Ext or something. Their toolkit is LGPL and all, but I still think that it would have been simpler if they hadn't been so monolithic. Now, I _am_ just grumbling. The Smartclient IDE was absolutely wonderful, with tons of smart components and a clean separation of concerns between the client being created and the services it consumes. Very good.

UPDATE:
As @ckendrik points out in the comments below, when the first versions of Smartclient were created (2001), no toolkits were available, which explains both the long list of features and the reason for a homegrown solution. I'm not at all suggesting that Smartclient is fracturing the framework space. It is actually very uncommon for companies with similar products to leverage an existing framework; Appcelerator, TIBCO, etc. have their own frameworks as well. This is very much alleviated by the fact that many companies are starting to provide bridges to other frameworks; Smartclient has extensions for GWT interaction, for example.

Coming up: "jMaki - buried treasure", "WaveMaker's upcoming massive cloud gambit", "Lessons learned around Mars" and finally "SSJS - the only way forward".

Your trusted uncle in San José,
PS

Wednesday, October 15, 2008

Codebits in Portugal


My presentation for Codebits in Lisbon, Portugal just came up. I hope not too many people go to both AjaxWorld and Codebits this year :)
http://codebits.sapo.pt/intra/s/speaker/10

Thanks 1.0E6 to Mário Valente for recommending me as a speaker. I haven't been to Portugal for over ten years!

Cheers,
PS

Academic exercises

A couple of weeks ago I was invited to speak at Blekinge Tekniska Högskola (BTH) here in Sweden on the subject of Thin Server Architecture, which was something I just couldn't turn down, even though I'm deep in several other things at the moment.

At my suggestion, I also made a couple of slides with exercises for the students in basic Dojo widgetry. Then I remembered how confused I had been when I began to learn Dojo and the peculiarities of JavaScript, like closures and object literals and such things, so I had to do some quick slides on those as well.
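For readers who weren't at the talk, the kind of closure behavior those slides had to cover boils down to this: an inner function keeps a live reference to the variables of the enclosing scope, not a copy of their values, which gives you genuinely private state.

```javascript
// The classic closure example: 'count' lives on in the returned functions,
// invisible and untouchable from the outside.
function makeCounter() {
  var count = 0; // private state, reachable only through the closure
  return {
    increment: function () { count += 1; return count; },
    current: function () { return count; }
  };
}

var counter = makeCounter();
counter.increment();
counter.increment();
// counter.current() is now 2; a second makeCounter() gets its own 'count'
```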

A lot of thanks goes out also to Stuart Langridge, whose presentation "secret of javascript closures" I was able to use instead of writing my own :) It saved a lot of time.

All in all I was surprised at how many of the students understood what I said, were able to (or wanted to) follow the exercises, came up with good questions, and had already seen the Crockford vids on 'advanced javascript'. These guys were really up to speed! OK, they had only worked with jQuery/Mootools concatenation-style toolkits before, so the relatively 'classic' coding style of Dojo was something new.

What I also realized is how truly huge Dojo is, when I tried to do a quick feature rundown. The 2D cross-browser Gfx, the charting support, fx and easings. Especially the animations were fun to show, as they're really completely generic and can be used to morph from one CSS class to another, or to change color, or to move an element on the page, according to the easing rules which determine how fast or slow the values change over time. That's really slick.
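The easing idea itself is worth a quick sketch: an easing function maps normalized time to normalized progress, and the animation just interpolates any numeric property through it. This is plain illustrative JavaScript with invented names, not Dojo's actual animation API:

```javascript
// An easing function: t in [0, 1] -> progress in [0, 1].
// This one accelerates for the first half and decelerates for the second.
function easeInOutQuad(t) {
  return t < 0.5 ? 2 * t * t : 1 - Math.pow(-2 * t + 2, 2) / 2;
}

// Generic interpolation of a value over a number of steps.
// A real toolkit would apply each frame to a style property on a timer.
function animateValue(from, to, steps, easing) {
  var frames = [];
  for (var i = 0; i <= steps; i++) {
    frames.push(from + (to - from) * easing(i / steps));
  }
  return frames;
}

var frames = animateValue(0, 100, 4, easeInOutQuad);
// starts at 0, ends at 100, moving slowly at the edges and fast in the middle
```

Because the easing is separate from what is being animated, the same function can drive a color morph, a position change or a CSS class transition, which is exactly what makes the approach so generic.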

Thanks to all the students (and teacher) for having me, and having tried this out I can say that I'm now officially prepared to teach at other places as well :)

Cheers,
PS

Tuesday, October 7, 2008

Conferences come in groups


I'm trying to find time each day to finish my presentation for AjaxWorld, which I'll present on the 20th this month in San José. I've got permission from Kris Zyp of SitePen and Ganesh Prasad of SOFEA fame to borrow their pictures for various slides, so there's hope that I'll have something more than text in it!

The title of the presentation is "Practical Thin Server Architecture with Dojo" and I hope to live up to the title, even though I won't be doing more than talk and dance.

I spoke to Leslie Hawthorn at Google when I visited the Open Source offices (and got a spanking nice T-shirt :) this summer about doing a tech talk (apparently called 'Google MTV' internally) on the subject of Thin Server Architecture, which I just got confirmed. It's on October 24th, in the middle of the raging Summer of Code unconference (which I'll also hopefully attend).

That's two talks of the same kind in the same week, which is OK. But what happened two weeks ago was that I got a mail from Mattias Schertell at Blekinge Tekniska Högskola (a technical university in southern Sweden) who wondered if I could do a day of workshops and talks around the subjects (TSA and Dojo), which I accepted, and I'll fly there Tuesday next week. I have to remake a couple of exercises I wrote a while ago this weekend, to get them up to Dojo 1.2 speed. Mental note to self and all that.

Then a friend asked if I couldn't speak at a Swedish conference (on Ajax, etc.) also next week, on the 16th, which I could, so that's another engagement right there. There's another conference by the same organization in early 2009 as well, so...

Anyway, completely unconnected, comes an email from Pedro, a Portuguese organizer of the Codebits 2008 conference in November this year, who also invites me to speak, and how can I say no to that!

So all in all it seems that my future is filled with glaring lights, repetitions and pointed questions :)

On the plus side, my next chapter to be rewritten in response to proofreading for the book is due as far out as October 19, which feels really far off at the moment. I've managed to do some coding on World Change Network, but still hope to get the new menu working properly and to have the CRUD tables skinned to spec.



While evaluating WaveMaker 4 for my client, I tried to create a custom component that wrapped dojox.charting, which I managed (with some help from WM personnel - thx!) to make accept complex JSON objects as data series input and switch both chart type and color theme on the fly. Pretty neat actually, and not as hard as I thought. It's open source as well, so mail me if you want it!

Cheers,
PS

Saturday, September 20, 2008

Return of the WaveMaker

I had reason to pick up WaveMaker again recently for a customer project. I knew that a version 4 had been released and had been meaning to check it out for a while, but I had a lot of other things calling my attention all the time.

As you might remember, WaveMaker is a web-based IDE which lets you drag and drop a nice UI and connect tables, trees and other things to backend services, like databases, web services and even POJO Java classes. And this in itself is nice, but the big deal is that everything is done with very few clicks in the IDE, it's all running from your own local machine (if you want, you might have a separate development machine for this, but it's in your control, is what I mean), and it produces a stand-alone WAR file that you can drop into JBoss or Tomcat or whatever to get your stuff up and running.

I really like RAD, when it works. WaveMaker had some problems in the 3.x release in making connections between UI components and backend services easy to understand, and this has thankfully become much better in the 4.x release.

Still, WaveMaker has two major drawbacks, for me:

1) It has no cloud. OK, this is wildly unfair, I know. But I'm comparing against GAE, and possibly Jaxer, all the time. What happens if I manage to build something that has to support millions of requests? Do I have to wrestle AWS again? Argh!

2) It requires Java to run the backend. Custom logic can thankfully be done in any language, as long as it is exposed as (almost) any kind of web service (WSDL, RESTish, etc.), but the server is done in Java, and requires a container. What would be very nice is if you could separate the excellent front-end IDE from the backend services it depends on, and build alternate backends. Maybe a Python backend? :)

I've been looking at ways to use WaveMaker with SSJS (Server-Side JavaScript) these last days, in my off time, and it is definitely doable. I downloaded the Rhino jar from Mozilla, which is the standard Java-based JavaScript engine (http://www.mozilla.org/rhino/download.html), and put it in the lib directory in my WaveMaker project folder. Then I copied and pasted one of the examples which comes along with Rhino, showing how to evaluate a Java String as JavaScript, into the WaveMaker wizard which helps you create new services from a simple POJO. And it compiled and smiled, just like that :)

The next step is to make it do something useful, and to tie the output of the service to a UI component, maybe a textfield, to see that it works. And what we have then is a DIY SSJS in conjunction with WaveMaker's excellent frontend. Since WM has a well-defined server-side Java API as well, exposing those objects to JavaScript should be fairly easy, since you'd probably want to interact with it at some point.

Then I got thinking about possible backend alternatives again, and noticed that Kris Zyp's SSJS Persevere framework has been accepted into the Dojo foundation. Persevere is really cool, but has no IDE as such, and depends on the developer knowing well how to consume RESTish web services from a JavaScript/Ajax client (very close to TSA/SOFEA ideals :)

What it has in spades is database rubberification and exposure through REST services. It can connect to a number of different databases, and presents all actions (even schema modifications and table management) in the same kind of REST metaphor as you would use to present adding a new row or modifying a column.

Persevere has a lot of feature overlap with WaveMaker's backend, but that's really very positive, since there's not so much more to add (I hope :). Unfortunately, Persevere also requires a Java backend, running on Rhino itself, but it could still be a nice addition.

Possibly, WaveMaker could also add SMD (Simple Method Description) web service generation to the list of existing services it can consume, which would make Persevere melt right in.
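For readers who haven't met SMD before: it is just a JSON document describing the callable methods of a service, which a client can read to generate stub functions automatically. The sketch below is simplified from memory, so treat the exact property names and values as approximate rather than as the real specification:

```javascript
// Rough, from-memory sketch of an SMD-style service descriptor.
// Property names and values are illustrative; check the actual SMD spec.
var smd = {
  transport: "POST",
  envelope: "JSON-RPC-1.0",
  target: "/services/charts",          // hypothetical endpoint
  services: {
    getCharts: {                       // one entry per callable method
      parameters: [{ name: "userId", type: "string" }]
    }
  }
};
// A consumer walks smd.services and builds one client-side stub per method.
```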

What I would really want is an SSJS cloud which is toolkit-agnostic, with the WaveMaker IDE as ... well, IDE and management console. The current contenders for the SSJS cloud throne are mostly (10gen, for example) forcing their own platform, making it impossible to import JavaScript from other sources (say, Dojo). Or if not, they force you to whip out your credit card before getting any access (Jaxer).

I think it's really sad that Jaxer does not provide a free entry-level poking-around alternative for their cloud. Google App Engine (OK, quite some more resources) and others can do it, and it really feels like a bad slice of an otherwise gorgeous apple pie.

I began with praising WaveMaker, then being unreasonable with them, then yammered at the state of SSJS in general. Coherent? Certainly not! :) Also, I really like Jaxer as well, but I think that they really should not push away customers like that. Everything else points in the other direction (i.e. really cool and very interesting), so not giving people the opportunity to check out the wares before ponying up (the amount is insignificant, actually, from this point of view) feels wrong.

Cheers,
PS

Tuesday, August 26, 2008

Strange summer and other things

What did you do on your summer vacation? Mine was kind of hectic. I came in contact with a group of people who were in the process of creating a startup and were searching for a front-end lead. The idea sounded strange, but I was wowed by the talent that sat at the table. We were people from all around the world: Sweden (obviously :), Pakistan, France, Canada and the USA.

I had a complete blast designing the front-end stuff and interacting and brainstorming with all the other guys and girls on the team. Wives were involved (under NDAs, naturally) and ideas spawned at any hour of the day; at dinner tables, while jogging, you name it.

Unfortunately I got more and more restless with what I felt was something that would not solve itself automatically; the end-user proposition, especially in light of similar services and/or services filling similar purposes. I might be wrong, but I feel that the end-user experience and value proposition must be central to a service, and by definition that includes its relation to its context (i.e. similarity neighborhood). I was so at odds with the rest of the team that I finally decided to quit rather than get into debates all the time.

This was not an easy decision, but in retrospect I still feel it was the best one. It feels very sad to become part of a dynamic, supersmart group for a month and then willingly close the door on it, but that's what I did. OK, I'm still involved in some side projects that are indirectly needed, but I'm not 'in' any longer.

So, that was quite a ride, and I'm still reeling from it. The positive side is that I now have more time on my hands (i.e. almost none) than before, so finishing my book at least becomes a possibility. That is a good thing.

I'm also working on two things that are on my conscience:

1) Get back to finishing the fine editor for JavaScript as a Dojo component, so that I can finish my Bunkai editor and deliver it this decade to the Sling project (and any other project needing a browser-based source code editor with pluggable resource managers).

2) Finish the simple Dojo / App Engine mockup for World Change Network before taking the family to dinner with the founder this Friday :) Might be a priority, really.

Other than that, Dojo 1.2 is creeping ominously closer to being released (no thanks to me, I might add), and if you haven't checked out the truly cool new features, just look at the following:

Just-what-you-always-wanted DOM text effects; explosions, slicings, you name it!

An Eclipse/Croquet-Squeak object browser / drill-down list. MIGHTILY useful.


Insanely cool dynamic 2D charts with events, sliding pies and onmouseovers with quivering data points. Mmmm..

And of course a lot more, but these are my personal favorites. The last two come from my personal stash, a cross-domain build of 1.2 from svn, reasonably updated. Feel free to use it if you want to play with 1.2 but don't want to wait for AOL or Dojo to put up a build. Use the same instructions as for AOL.

Continuing the Dojo trend, make sure to take a look at WaveMaker's latest release (with surfboard and everything :). As you all know, WaveMaker is an open-source web-based IDE that lets you create Ajaxified applications point-and-click style, with a WAR file as the result. Simple component bindings to SQL tables and/or Web Services as well.

Really cool, actually. If you're into Java, that is. I wonder what would happen if they took that front-end and put it on Rails or App Engine? Hmm...

And I'm also gearing up to talk at AjaxWorld this October. I really hope that Red (at the Google Open Source Programs Office) will find a spot to do a tech talk that week, since it's going to be a Summer of Code party then as well.

Anyway, if you're in San José October 20-25, please ping me for a beer or two :)

Oh! And don't miss Kris Zyp's devastatingly spot-on article on client / server programming on the web.

Cheers,
PS

Monday, June 23, 2008

Twitter and Gears


OK, this is probably tangential to the book I should be writing right now (20 pages to go until Sunday. Argh!), but it _is_ Dojo.

Twitter is more like a kind of weather than a service. It goes well at times, and at times it does not. Many times I wish I could just search through old tweets, or sort them quickly in some way not intended by Twitter.

I've actually thought: why not use Gears to store all tweets and then just use a Dojo Grid to show them? Separating download and presentation in a perceivable way could actually be a feature of a Twitter client.

Most probably someone has already done this, and much better than I, but anyway, I've actually made it work... sort of :)

So if anyone knows if this has already been done, please mail me or comment here. Once the stuff is in the Google Gears store I thought I'd add an MIT Simile timeline to splunk it up a bit as well.
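For the record, the core of the idea is tiny. Once the tweets live in a local cache (Gears' SQLite database, in my plan), searching and sorting become plain client-side array work. A minimal sketch follows; the tweet shape (`{id, created, text}`) and the function name are my own invention, not anything from Twitter's API:

```javascript
// Sketch of the client-side half of the idea. The tweet objects
// ({id, created, text}) and searchTweets() are my own invention,
// not part of Twitter's API or Gears.

// Return the tweets whose text contains the query, newest first.
function searchTweets(tweets, query) {
  var q = query.toLowerCase();
  return tweets
    .filter(function (t) { return t.text.toLowerCase().indexOf(q) !== -1; })
    .sort(function (a, b) { return b.created - a.created; });
}

// The Gears side is browser-only, so it is just sketched in
// comments here; roughly:
//   var db = google.gears.factory.create('beta.database');
//   db.open('tweet-cache');
//   db.execute('CREATE TABLE IF NOT EXISTS tweets (id, created, text)');
//   db.execute('INSERT INTO tweets VALUES (?, ?, ?)', [t.id, t.created, t.text]);
// ...with the result set then feeding a dojox.grid grid via a dojo.data store.
```

The nice part is that once the cache exists, Twitter's own uptime weather stops mattering for browsing and sorting.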

Cheers,
PS

Tuesday, June 3, 2008

From Google with Love

After a grueling 17-hour flight from Stockholm, bouncing once in Chicago, we (me and my family) arrived at SFO and, shortly after, at the Hotel Triton at Grant Avenue and Bush. The kids started hopping around between the beds like grasshoppers while the poor parents attempted some sort of controlled landing. The room was very nice, if a bit on the small side.

After a couple of obligatory days spent on the obvious tourist activities (having a great time all the same, especially the kids; I really recommend SF as a place to go for vacation) I went to the Dojo Dinner that was arranged the day before Google I/O.


It was really cool to meet people I've only had e-mail contact with before, like Alex Russell and Dylan Schiemann, among others. We had a great Vietnamese dinner which sort of turned into a vehicle for party-crashing the Wordpress release party a few blocks away. At the dinner were also three really nice Dojo specialists all the way from Bogotá (sorry, I have to type this in at lunch and don't remember everyone's names atm).





Nice room (the Pokémon pillow was ours).

Kids grasshoppering.

Having acquired the secret password for the WordPress bar landed me in a lot of interesting conversations, listening in on the Flickr guy, the Wordpress guy and also Rohit Khare, who had something new up his sleeve, more of which probably later.

Next morning it was time to get to the Moscone Center for some serious Googling. I was at the right place (sort of) with twenty minutes to spare. This was Moscone Center:

Hmm. No Google logos there, eh? OK, this was a really _big_ affair. The entrance might just as well be on the other side, right? I started to walk around the center, keeping a brisk pace, looking out for any sign of a convention, Google or no.

As it turned out, the next side of the building was clearly a 'side' with no real official entrances, turning up the hopes for the 'other end', as it were.

Much to my chagrin, the other side was just that, leaving me in an even worse spot. Each side of the building was perhaps 400-500 meters, and the only thing I could do now was to go back to the side where I started. There again I noticed that the building _way_ over on the other side of the street also had "Moscone" signs. After crossing the street, I finally got hold of a person who directed me to yet another building, Moscone "West", which was barely visible from where we were standing. It was pure luck Google had not set up shop in Moscone South, Egypt.

I was not alone in attending Google I/O;


The keynote was good, except for the GWT guys trying to bash JavaScript and only managing an empty clapping sound. It's about 55 minutes in. I have liked GWT a bit on general principles before, and I still think it is a great solution. Unfortunately it is a solution to a problem that doesn't exist.

Then I went to get an introduction to Python by Guido himself! Is that cool or what? He's, like me, European, well built (...), humorous and wonderfully opinionated. Most things I had read up on before, but he got in a lot of history, examples, et.c. between the lines, so it was really great.

Also, I managed to meet up with Kirk Wilson, CEO of World Change Network, to plan the next release of the system. The general idea is to create a reputation-based project store and exchange, focusing on solving problems for people in developing countries. The reason for this is that there might in many cases be tried-out processes for certain problems (donating blood, managing around corrupt officials, seeking legal help, et.c.) that the people in need of them are not aware of.

Also, the system will contain basic project management, supporting coaching and helping from all over the world. I think of it as aid by donating project management rather than money. I'm sure Kirk can put it more eloquently, but that's a fair summary from my point of view, I suppose :)

The party was a smash hit of Olympic proportions. Dozens of linked arcade games, chocolate fountains, Googlers somersaulting in dragged-together heaps of bean bags, beer, wine and ale of exotic brands (and Steam ale, yes :), sushi, pastas, mashed potatoes and gravy... and Flight of the Conchords. I didn't think I would find them funny, but they were. They were also very good musicians.

I think I went to an English pub after that, with some people in the real-estate business (those cards were left at home again, sorry!) with Swedish ancestry.

The next day went by in a blur, finally coalescing into the acquisition of a Google Gears Beta T-shirt! Yay! And also the devastating wit and vehemence of Steve Yegge.



He gave a good rundown of his server-side JavaScript (SSJS) work at Google, and of his plans to release his Ruby on Rails-in-JavaScript framework, Rhino on Rails (RnR), Real Soon Now. For all those present who still hadn't got it, he gave ample examples and conjecture for why dynamic beats static when it comes to choosing a language, and why late binding and JITs beat the living daylights out of static compiling. In all, a wonderful talk, if too short.

If you read this, Steve, remember that you've _almost_ promised to come to Stockholm and give a (longer) speech later this year :) Just so you don't forget.

After Google I/O wound down, Kirk was the perfect host, giving us a truly great tour of San Francisco; the Golden Gate, the Yacht Club, the park, Haight-Ashbury, et.c. After that we were deposited, light as feathers, at the restaurant Viva!, recommended by Mats Henricsson, devouring the Pescatore pasta and then enjoying a quiet walk home to the hotel through Chinatown.

Being a mentor for Dojo in the Google Summer of Code 2008 program, I was invited to a lunch at the Googleplex by the indefatigable Leslie Hawthorn of the Google Open Source Programs Office. I must really thank her for acquiescing to my request to bring the whole family. They actually had T-shirts in the kids' sizes.


When we got home to Sweden again, we found out that our daughter had been collecting gravel from the path outside building 43 and putting it in the jacket pocket of my wife, so now we have four very precious Google Stones at home, possibly with the same effects upon basic characteristics as AD&D IOUN stones.


Also, the food was indeed great, and Leslie had this great T-shirt in commemoration of the day they had the author of xkcd in for a Tech Talk. They had baked him a cake in the shape of the internet! :)