Posts tagged with: talis
Moving forward back to Self-Employment
I'm self-employed again after an inspiring year at Talis.
My time at Talis Systems officially ended last week. I joined the team during painful times, but I'm glad (and proud) to have been a Talisian, if only for a year. I had had a few freelance gigs with Talis before, but being part of the team was a whole different thing. And I could frequently travel to the UK, immprooff my inklish, and discover the nice city of Birmingham. There's a reason why they have that G in GB.
Work-wise, I probably learned more in the last 12 months than during the previous 5 years combined - hat tip to Julian, Leigh and all the other (Ex-)Talis folks. And much of that goes beyond just technical skills. I don't want to bore you, but you can definitely learn a lot about your path through life when you get the opportunity to look at it from a different perspective. Apparently, I first had to become an employee working in a foreign city to see the bigger picture around why I boarded that Semantic Web roller coaster in the first place and where it overlaps with my own ideas and interests.
So I am going back to self-employment. And I am also going to stay in the emerging Data Web market. But I'll approach some things differently this time.
First, a change of attitude, to contribute in a personally more healthy way again. I won't argue about technical details and specifications any more. That just turns me into a grumpy person (belated apologies). I doubt that promoting products by advertising their underlying technologies is the best way to establish and grow a market anyway. That's like trying to heat a room by just burning a lot of matches: promising, with renewed anticipation after each match, but useless without some larger fire in the end. I would like to help spark off these larger fires. Without constantly burning my fingers (OK, enough fire imagery ;-).
The second change is related, and it is about focus. While I still see many people using the ARC2 toolkit, I have had more encouraging feedback and signs of demand recently around my work for end users (including app developers, in a sense). So my new mission is to improve "information interaction" on the Web, and I'll be offering services in that area.
And it looks like I'm off to a good start. I am already fully booked for the next months.
Posted on 2012-09-19 at 16:40 UTC
I'm joining Talis!
I'll start working for Talis' Kasabi team

So I'm very happy to say that I'm going to become part of the Kasabi data marketplace team in September where I'll help create and drupalise data management and data market tools.

Can't wait to start!
Posted on 2011-08-18 at 09:00 UTC
Paggr article in Nodalities Magazine 6
The latest NodMag issue features an article about Paggr.
Talis' new Nodalities Mag is now available online (and the print version is on its way to subscribers). This issue contains six semantic web articles, including one about Paggr:
- Linking Data and Semantics at O'Reilly - Gavin Carothers and Charles Greer tell O'Reilly Media's Linked Data story.
- Discovering SPARQL - Alex Tucker exposes SPARQL endpoints via Bonjour.
- Linked Data In(ter)Action - Benjamin Nowack discusses Paggr.
- Introducing: STI International
- Social Semantic Web Scales in the Cloud - Simon Schenk discusses SemaPlorer.
- Streams, Pools and Reservoirs - Leigh Dodds explores flowing data.
Posted on 2009-04-29 at 09:30 UTC
Major ARC revision: Talis platform alignment, Remote Store, SPARQLScript
The latest ARC revision is aligned with Talis' platform structures and adds a Remote Store component and the start of a SPARQLScript implementation.
The latest ARC release comes with a couple of non-trivial (but also not necessarily obvious) changes. The most significant (as it involves ARC's resource indexes) is the alignment with the structures used by the Talis platform. ARC's parser output and PHP or JSON formats are now directly processable by Talis' platform tools. The documentation has already been updated; you may have to adjust your code in a few places (basically just "s/val/value/" and "s/dt/datatype/").
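For illustration, here's a minimal sketch of that adjustment for code that builds or consumes resource indexes by hand; the helper function is mine, not part of ARC:

    <?php
    // Hypothetical helper (not part of ARC): recursively rename the old
    // index keys to the Talis-aligned ones, i.e. 'val' => 'value' and
    // 'dt' => 'datatype'.
    function rename_index_keys($index) {
      $result = array();
      foreach ($index as $key => $value) {
        if ($key === 'val') $key = 'value';
        elseif ($key === 'dt') $key = 'datatype';
        $result[$key] = is_array($value) ? rename_index_keys($value) : $value;
      }
      return $result;
    }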
The second major addition is a Remote Store component (documentation still to come) that is inspired by and based on Morten Frederiksen's great RemoteEndpointPlugin. The Remote Store works like Morten's plugin, but also supports SPARQL+'s LOAD, INSERT, and DELETE (i.e. write/POST) operations.
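For the curious, a minimal sketch of what a write operation might look like; the config key name is an assumption based on ARC's usual conventions, so double-check once the documentation is up:

    <?php
    include_once('arc/ARC2.php');

    /* config key name is an assumption, check the docs once available */
    $store = ARC2::getRemoteStore(array(
      'remote_store_endpoint' => 'http://example.com/sparql',
    ));

    /* SPARQL+ write operation, sent to the remote endpoint via POST */
    $rs = $store->query('INSERT INTO <http://example.com/graphs/demo> {
      <http://example.com/doc> <http://purl.org/dc/elements/1.1/title> "Demo" .
    }');
    if ($errors = $store->getErrors()) {
      print_r($errors);
    }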
The third addition is also the reason why the Remote Store (which can be used as a SPARQL Endpoint Proxy) became a core component. I've been working on a draft for a SPARQL-based scripting language over the last few months, and the latest ARC revision includes an early SPARQLScript parser and a SPARQLScript processor that can run a set of routines against remote SPARQL endpoints. What's still missing before this stuff becomes more usable (apart from documentation ;) is output templating and some other essential features such as loops. I do have an early prototype running in a local SPARQLBot version, but I probably won't have it online in time for tomorrow's Semantic Scripting Workshop (which I'll try to attend remotely, at least). This is really powerful (and fun) stuff that will be available soon-ish. I can't wait to replace my hard-coded inferencer with a set of easily pluggable SPARQLScript procedures.
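To give an idea of the kind of routine this targets, here's a multi-step read/write sequence spelled out in plain PHP against the Remote Store; a SPARQLScript procedure should eventually express this in a few lines (the endpoint, graph, and vocabulary URIs below are placeholders):

    <?php
    include_once('arc/ARC2.php');

    $store = ARC2::getRemoteStore(array(
      'remote_store_endpoint' => 'http://example.com/sparql',
    ));

    /* step 1: read a set of resources from the remote endpoint */
    $rs = $store->query('
      SELECT ?s WHERE { ?s a <http://xmlns.com/foaf/0.1/Person> } LIMIT 10
    ');

    /* step 2: write a derived triple back for each result */
    foreach ($rs['result']['rows'] as $row) {
      $store->query('INSERT INTO <http://example.com/inferred> {
        <' . $row['s'] . '> a <http://example.com/Agent> .
      }');
    }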
Other tweaks and changes include a very early hCalendar extractor and fixes for a couple of bugs reported by (among others) the SMOB project maintainers.
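In case you want to play with the extractor, something along these lines should work; I'm assuming it hooks into the SemHTML parser like the existing extractors do, and the 'hcalendar' identifier is a guess:

    <?php
    include_once('arc/ARC2.php');

    $parser = ARC2::getSemHTMLParser();
    $parser->parse('http://example.com/events.html');
    /* extractor identifier is a guess, see the source for the exact name */
    $parser->extractRDF('hcalendar');
    print_r($parser->getTriples());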
As usual, thanks to all who sent in patches, bug reports, feature requests, and stress-tested ARC. I think we're pretty close to a release candidate now :-)
Posted on 2008-06-01 at 17:50 UTC
New ARC2 plugins
Keith Alexander is an ARC2 plugin factory
If there were a "most productive SemWeb coder" category in Danny's "This Week's Semantic Web", this week's winner would probably be Keith Alexander. Last week, he provided no fewer than three ARC2 plugins:
- An RDFa Serializer
- Utilities for working with ARC structures (e.g. filter, merge, or diff)
- A SPARQL (Re-)Serializer (very handy for checking/adjusting SPARQL queries before they are passed to the store)
While at it, he also implemented a SPARQL+ wrapper for Talis Platform stores.
I think I blogged about Morten's RemoteEndpoint plugin a while back (this one should really become part of the core codebase), but did I mention Peter Krantz' File System Synchronizer? It keeps an RDF store in sync with a file system directory, which opens up a really nice way to implement larger RDF editing systems on top of ARC: by combining editing tools that work with small RDF files (quick response times and everything) with his plugin, you can provide rich query functionality over the whole dataset without the store getting in the way of the publishing tools. RDF index rebuilding can be slow, so de-coupling read from write operations and introducing an asynchronous update process is a nice solution.
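Here's how I picture the asynchronous part, as a cron-style job in plain ARC; paths, config values, and the change-detection window are placeholders, and this is my reading of the pattern, not Peter's actual plugin code:

    <?php
    include_once('arc/ARC2.php');

    $store = ARC2::getStore(array(
      'db_host' => 'localhost', 'db_name' => 'rdf',
      'db_user' => 'user', 'db_pwd' => 'pwd',
      'store_name' => 'docs',
    ));
    if (!$store->isSetUp()) $store->setUp();

    foreach (glob('/data/rdf/*.rdf') as $file) {
      if (filemtime($file) < time() - 300) continue; /* skip unchanged files */
      /* one graph per file, so a re-import replaces that file's triples */
      $graph = 'file://' . basename($file);
      $store->query('DELETE FROM <' . $graph . '>'); /* SPARQL+ graph deletion */
      $parser = ARC2::getRDFParser();
      $parser->parse($file);
      $store->insert($parser->getSimpleIndex(), $graph);
    }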
Awesome stuff.
Posted on 2008-03-31 at 17:00 UTC