Bachelor Thesis!

In the last semester at the IT University I wrote my Bachelor Thesis together with Peter Thorin. We evaluated the maturity of a Microsoft tool for model-driven development called Microsoft Domain-Specific Language Tools (DSL Tools).

The work included building a MetaModel for Service-Oriented Architectures (SOA) using DSL Tools and comparing the results, both in the time it took to develop the MetaModel and in the number of elements in our language, with existing studies of similar techniques (UML MetaModel extensions and UML Profiles).

Since no established definition of SOA existed, a literature review was conducted as part of the work. This resulted in a requirement specification of which elements should be included in our SOA language, based on how many different studies mentioned each element.

The work was very interesting, since no previous work on DSL Tools was available and DSL Tools is likely to become an important technique for developing software.

The work resulted in an article: Evaluating Microsoft Domain-Specific Language Tools – an Empirical Study of Domain-Specific Languages and Service-Oriented Architecture. The study was awarded an A.

Video from the presentation

Update 071129: This paper has been accepted to the SERPS conference! Read more on the Göteborg University web site.

Semester 5: Project Socio

In the fifth semester (at the IT University) I was part of a six-person team that analyzed an existing application written for Mac OS X (in Objective-C), with the ambition to reuse as much of the application as possible while changing its requirements.

The first requirement added was that the new application (Socio) should be available on platforms other than Mac OS X. This requirement was met by writing the new code in C# / Mono and embedding both the Mono runtime and the Objective-C runtime in a C host application.
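
As a rough sketch of what such a host can look like (the entry assembly name Socio.exe is a placeholder, not necessarily the project's actual file), the Mono embedding API lets a plain C program boot the runtime and hand over to managed code:

```c
#include <mono/jit/jit.h>
#include <mono/metadata/assembly.h>

int main(int argc, char *argv[])
{
    /* Start the Mono runtime inside this native process;
     * "Socio" is just the name of the root application domain. */
    MonoDomain *domain = mono_jit_init("Socio");

    /* Load the managed assembly (the file name is a placeholder). */
    MonoAssembly *assembly = mono_domain_assembly_open(domain, "Socio.exe");
    if (assembly == NULL)
        return 1;

    /* Run the assembly's managed Main() and propagate its exit code. */
    int ret = mono_jit_exec(domain, assembly, argc, argv);

    mono_jit_cleanup(domain);
    return ret;
}
```

The sketch shows only the Mono side; on Mac OS X the Objective-C runtime is available in the same process simply by compiling and linking the host as Objective-C.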

The second requirement added was to make the current monolithic application extensible. This was solved by creating a plug-in architecture in which the application's complete functionality runs as a set of plug-ins.
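
To illustrate how the native host can hand control to the managed plug-in machinery (the namespace, class, and method names below are hypothetical, not Socio's actual API), the Mono embedding API can look up and invoke a managed bootstrap method:

```c
#include <mono/jit/jit.h>
#include <mono/metadata/assembly.h>
#include <mono/metadata/class.h>
#include <mono/metadata/object.h>

/* Call a hypothetical static Socio.Core.PluginHost.LoadAll() method,
 * which on the C# side would discover and load the plug-in assemblies. */
static void load_plugins(MonoAssembly *assembly)
{
    MonoImage *image = mono_assembly_get_image(assembly);

    MonoClass *klass = mono_class_from_name(image, "Socio.Core", "PluginHost");
    if (klass == NULL)
        return;

    /* 0 = number of parameters the method takes. */
    MonoMethod *method = mono_class_get_method_from_name(klass, "LoadAll", 0);
    if (method == NULL)
        return;

    /* Static method: no instance, no arguments, exceptions ignored here. */
    mono_runtime_invoke(method, NULL, NULL, NULL);
}
```

Each plug-in can then be an ordinary managed assembly that the managed side loads via reflection, so the native host never needs to know about individual plug-ins.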

We began with an analysis of the current application, followed by writing a requirement specification and an architecture description for it (no requirements or architecture documents were available). We then added the two new requirements and created a new architecture, which we implemented.

The hardest part of this project was understanding, learning, and making all the different technologies work together.

My roles in this project were architect and developer.

The following artifacts were produced during the project:

  • Article – An Evaluation of Porting and Reuse Activities
  • Requirements
  • Architecture
  • Design
  • And a number of other documents

Download all documentation from the Socio project (16 MB)

Development of this project continues; see the Socio web site and the Sharp Wired web site for more information.

Project SEAGRID – Done!

The fourth semester's project (at the IT University), SEAGRID, aimed to create a simulation of an automated container harbor. It was a mid-size project with about 50 team members. To organize the project, all team members were split into nine different teams, each with its own responsibilities.

I had two roles in SEAGRID. The first was team leader of the project team responsible for the network connections between all the different nodes in the system; the other was developer of parts of the network code. There is much to learn from being part of such a big project, the most obvious lesson being how much work the synchronization of information and knowledge between different teams requires. Another lesson is that the overall architecture and requirements affect how effectively the organization works: a clear architecture is crucial for splitting the workload between teams as early as possible.

Reports and source code from SEAGRID are available below:

I worked in the distributed systems team. See chapter 4 in the report for more information.

Project Rumorize: 3 down – 3 to go!

In the third semester at the IT University we had the idea to create a system that searches various discussion forums and other web sites and assembles similar information. This was done by creating an intelligent agent that searched the web sites, grouped the data, and put the information in a database. The data could then be presented on web sites or in other media.

My roles in this project were to ensure the quality of the documents produced and to develop the agents and the matching algorithms. All implementation was done in Erlang.

The following documentation was produced:

  • Requirement Documentation
  • Design Specification
  • Source Code
  • Rumorize! web site
  • A number of other artifacts

The data in the version of the web site above is based on about 500 000 posts from various Linux support forums. Note! The data on the Rumorize! web site has not been updated since January 2005.

Download: All Documentation from the Rumorize project!