Category Archives: Technology

3rd Annual Ja-Sakai Conference Report

I was very pleased to join Lois Brooks at the 3rd Annual Ja-Sakai Conference (translated) hosted by Kumamoto University. The day prior to the conference, Ryuichi Matsuba asked me to give a presentation to his colleagues on how Indiana University’s software development efforts are organized and the various roles that team members play (slides link). I was only able to finish about half of the presentation due to time constraints, but we did have some good discussion on related topics. I was very happy that Lois Brooks was able to participate in the discussion, as she fielded some of the questions I could not address directly. Thanks Lois!

Afterwards, Professor Makoto Miyazaki of Kumamoto University provided a sneak-peek of his Ja-Sakai presentation (kindly translated into English) detailing their ePortfolio development efforts which include:

  1. uPortal as the primary landing page for students. This provides the launch points for accessing both of their WebCT CE6 and Sakai 2.6 OSP instances.
  2. Student artifact submissions via the native WebCT user interfaces.
  3. Automated batch transfer of the artifacts to Sakai OSP Matrix for evaluation.
  4. A newly developed OSP tool, which they call “Notifications”, that acts as a type of dashboard allowing users to easily see pending work and links directly to the tools for completion.

This work is in support of their competency-based curriculum and will be presented at the 2010 Sakai Annual Conference in Denver, Colorado. For more information about Kumamoto University’s portfolio practices, see:

The following day was the conference, which started with some very kind opening remarks from Dr. Shin-ichi Abe, Vice President and Trustee of Kumamoto University, and Dr. Takeshi Mase of the Nagoya University Graduate School of Information Science. After the opening remarks, Lois Brooks provided a very nice presentation titled “Technology and Learning”.

Following Lois’ presentation, I delivered “Sakai 3 and the next major web technologies”:

The remainder of the conference was conducted in Japanese, so Lois and I were kindly escorted by Tomomi Nagata and Yuuki Tsunemoto to a vegetarian lunch featuring twelve different types of tofu, and a traditional Japanese tea ceremony at a local samurai house. We then finished our afternoon at Kumamoto Castle before returning to the conference center for the Ja-Sakai reception. Some key takeaways for me from the reception:

  1. Nagoya University has officially announced their plans to migrate from WebCT to Sakai. The other universities are watching very closely.
  2. Kansai University, the largest private university in Japan, has adopted Sakai in partnership with a commercial entity (NS Solutions?). Their biggest issue with Sakai is the lack of custom workflows, and they were very excited to see the work coming out of Sakai 3 to help resolve this issue.
  3. Learning Java presents a barrier to Sakai-related development, and lowering this barrier to entry is an important concern (e.g. think PHP developers). Much of Sakai development is occurring at the edge in Japanese universities.
  4. Japan now has at least two commercial partners that are active in the Sakai space.
  5. Internationalization and localization of both Sakai the software and Sakai the website continue to be a hurdle for Japan.

I would like to convey many special thanks to our very kind and generous hosts from Kumamoto:

  • Dr. Ryuichi Matsuba
  • Dr. Shin-ichiro Kubota
  • Dr. Hiroshi Nakano
  • Ms. Tomomi Nagata
  • Ms. Yuuki Tsunemoto

PS – Ryuichi Matsuba made good on his promise to Sakai last year and unveiled two new Sakaigers at the conference this year. As you can imagine, the Japanese find the Sakaiger very endearing and kawaii!

PPS – I was a bit overwhelmed at how many QR codes I saw in Japan – they are everywhere! Maybe what piqued my curiosity the most was the fact that the promotional materials for the conference (both print and web) contained QR codes. I thought I remembered Google starting to promote QR codes with Google Maps, and that was indeed correct. I have equipped my iPhone with a QR reader called QuickMark QR Code Reader, so I am now able to read QR codes, and I am using the QR Code Generator from the ZXing Project to generate them. It looks like this technology is on the verge of adoption in the US and we should encourage it. Scanning a QR code with your phone sure beats typing a URL on a mobile device! I have also seen QR codes displayed on business cards as a quick way to get a new contact into your address book. Very cool!


1 Comment

Filed under Education, Sakai, Technology

Status Update on Basic LTI Widget for Sakai 3

I am returning from a very productive trip to Japan to participate in the 3rd Annual Ja-Sakai Conference (which I will blog about shortly) – but I did want to provide a quick update on the progress I have made in developing the Sakai 3 Basic LTI consumer widget. In my previous post on this subject, Sakai 3 Basic LTI Widget Sprint, I outlined five action items:

  1. At least one screencast demonstrating the work to date – probably two. One showing the Basic LTI plumbing and passing of the LTI certification tests, and another showing Sakai2 tool placements in Sakai 3 sites.
  2. Some of the information in an LTI launch needs to be secured from end users. This will be a chance for me to learn more about access control in Sakai 3; see: KERN-591.
  3. The widget itself needs some renovation. I will likely copy the existing Remote Content widget and add the LTI settings.
  4. Create some virtual tools for existing Sakai 2 tools and add these to the out of box Sakai 3 experience.
  5. Work on the Sakai 2 Basic LTI provider servlet to make it more robust and support the specific Sakai 2/3 integration use cases.

So let me address these in order:

  1. I still do not have any screencasts to show. I have made some significant progress on #3, so I am waiting to complete the UI overhaul before recording any screencasts. Please be patient – I will have something to show very soon.
  2. I am pleased to report that the implementation of securing the appropriate LTI launch parameters (i.e. secret and password) has been completed. Each launch node, which contains the settings, will now have a sub-node that can only be read or modified by an administrative account. This sub-node contains the LTI secret and password. The service then masks the complexity of this sub-node implementation so that the user interface contract does not change. The end result is that if the LTI-related nodes are found via search or some other mechanism, we no longer need to be concerned that any related sensitive data will be exposed to unprivileged users.
  3. I am also pleased to say that some significant progress has been made on a new user interface for this widget. Using the code from the Remote Content widget is proving to be quite useful and is delivering a nice user experience. There has been some debate about whether the Preview capabilities from the Remote Content widget are appropriate for this use case, but assuming that adding these capabilities to the service is not too time-consuming, I plan on including this feature for user testing if nothing else. I personally believe the preview capability is worth providing, but I am willing to be proven wrong.
  4. No progress has been made regarding virtual tool registrations. I believe this use case is not unique and will extend beyond the Basic LTI needs. I have been procrastinating on this topic until I have the time to engage in the discussion fully.  I expect that will be when the other action items are complete and we can apply focus to this one particular aspect of delivery.
  5. Some good progress has been made on the Sakai 2 BasicLTI provider. Through my work on BLTI-24, the provider is a bit more robust. And through Steve Swinsburg’s work on BLTI-31 we are now closer to having a tighter relationship between the Sakai 3 consumer and the Sakai 2 provider. This tighter trust relationship assumes the consumer is closely related to the provider (e.g. maybe within the same domain). I envision using this high trust configuration in conjunction with some auto-provisioning to keep a Sakai 2 and Sakai 3 system in synchronization, which we expect to be common for most hybrid implementations.

Leave a comment

Filed under Education, Sakai, Technology

Importing content from Sakai 2 into Sakai 3 (take 2)

Since returning from holiday, I have rejoined the matter of Importing content from Sakai 2 into Sakai 3. The first order of business was to refactor the XML parsing from SAX to StAX to deal with some potentially nasty classloader issues, as suggested by Dr. Ian Boston. That went smoothly, and I have to say that after using SAX, StAX is a much-improved utility: you have more control and can pull events from the stream rather than having them pushed to you. This makes for more natural and supportable Java code.
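
To illustrate the pull style, here is a minimal sketch (not the actual import code; the `resource` element name and the method are purely illustrative) of a StAX loop where the caller drives the parse with ordinary local state instead of SAX callbacks:

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

class StaxPullExample {
    // Count <resource> start elements by pulling events from the stream.
    // Parsing state is just a local variable - no handler class needed.
    static int countResources(String xml) {
        try {
            XMLStreamReader reader = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new StringReader(xml));
            int count = 0;
            while (reader.hasNext()) {
                if (reader.next() == XMLStreamConstants.START_ELEMENT
                        && "resource".equals(reader.getLocalName())) {
                    count++;
                }
            }
            reader.close();
            return count;
        } catch (javax.xml.stream.XMLStreamException e) {
            throw new IllegalStateException(e);
        }
    }
}
```

With SAX, the equivalent logic would be scattered across `startElement` callbacks and fields on a handler object; here the control flow reads top to bottom.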

Next, there were some improvements that I wanted to make to the import code:

  1. Support for org.sakaiproject.content.types.urlResource types.
  2. Adding the metadata to the imported content.

After a week of plugging away on both fronts, some good progress has been made. First, a model conversion had to be considered for org.sakaiproject.content.types.urlResource types. In Sakai 2, these URL resources are simply presented in the UI as hyperlinks that open in a new window. Given the RESTful nature of Kernel2 (K2), I needed to decide how best to represent a hyperlink. My first thought was using the proxy capabilities of K2, but that presented some issues, as proxy nodes must be stored under /var/proxy and the whole notion of proxying HTTP requests has security implications – that is why K2 does not allow just anyone to create a proxy node.

I was probably too close to the problem and had trouble seeing the more obvious solution – why not use an HTTP redirect? After noodling the problem for a while, the simpler solution finally entered my brain. After a bit of acking, I found that Sling already has support for redirects through its RedirectServlet, which binds to sling:resourceType=sling:redirect. So then, it was just a fairly simple matter of creating a node and setting the properties accordingly:

"jcr:created":"Wed Oct 07 2009 13:53:00 GMT-0400",
"jcr:lastModified":"Wed Oct 07 2009 13:53:00 GMT-0400",

That was pretty much it for org.sakaiproject.content.types.urlResource types; the redirect works as expected. There are still a couple of things I would like to improve in this area:

  1. The node names for these urlResources need to be beautified. As the resource name comes through in the content.xml, it looks like “http:__sakaiproject.org_”. I have to strip out the “:” to avoid a JCR exception, so the node name currently looks like “http__sakaiproject.org_”. Ideally, it would match the display name, i.e. “”. Perhaps some manner of escaping invalid characters might work, but further digging into the JCR is required. I am able to set “sakai:filename”:””, so maybe that is good enough; TBD.
  2. Since the jcr:primaryType==nt:unstructured, the URL is rendered as a folder when connected via WebDAV. It would be nice to get these URLs to render as a leaf node instead. I experimented with jcr:primaryType=nt:file, but ran into some roadblocks and backed off.
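
The stripping described in the first item above amounts to a one-liner; a tiny hypothetical helper (the method name is mine, and the character class follows the JCR's reserved name characters) might look like:

```java
class NodeNameCleaner {
    // Strip characters that are illegal in a JCR node name:
    // ':' (reserved for namespace prefixes) plus '/', '[', ']', '|', '*'.
    // This mirrors the ad-hoc cleanup described above, not a full
    // escaping scheme.
    static String toJcrNodeName(String resourceName) {
        return resourceName.replaceAll("[:/\\[\\]|*]", "");
    }
}
```

A proper escaping scheme (so the original name is recoverable) would be the nicer long-term answer, as noted above.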

Regarding the mapping of metadata, that task proved to be mostly straightforward for the fields that have a one-to-one mapping. However, there are currently more supported metadata fields in Sakai 2 than there are in Sakai 3. There is no limitation on the number or type of metadata fields that can be stored in Sakai 3, so I am considering storing all of the fields from Sakai 2 as a precaution and possible future-proofing. I am left wondering whether to store them with their current keys or to prepend something like “sakai2:” to all of the keys before storing them.
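
If the prefix route wins out, the transformation itself is trivial; a sketch (the "sakai2:" prefix is the option under consideration, not a decision, and the class and method names are hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

class MetadataMapper {
    // Namespace prefix under consideration for imported Sakai 2 fields,
    // so they cannot collide with native Sakai 3 properties.
    static final String PREFIX = "sakai2:";

    // Copy every Sakai 2 metadata field, prepending the prefix.
    static Map<String, String> prefixKeys(Map<String, String> sakai2Meta) {
        Map<String, String> out = new HashMap<>();
        for (Map.Entry<String, String> e : sakai2Meta.entrySet()) {
            out.put(PREFIX + e.getKey(), e.getValue());
        }
        return out;
    }
}
```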

Looking towards the near term, I am likely to look into the following issues:

  1. All user uploaded content is currently stored in a BigStore under /_user/files. After discussing this with Ian Boston, I will most likely refactor the import code to store its content in that BigStore as well. The BigStore concept will likely be redesigned in the near future, though, so any work I do in this area will be nicely abstracted so that this behavior can be changed easily if and when BigStore is redesigned.
  2. With the move to BigStore, I will have to take a look at access control lists (ACLs) so that the user importing content will have the proper permissions.
  3. Next, I need to take a look at the contract between K2 and the “Content & Media” widget so that the imported content appears properly within the user interface.
  4. What about other content types that could be imported today? Content from the Forums tool may be a good candidate as K2 currently has support for threaded discussions. Chat might be another place to look… Other ideas?

Regarding the Sakai2+3 Hybrid mode, I have hopes to arrange a two day coding sprint with Dr. Chuck Severance and Noah Botimer to develop a BasicLTI consumer for Sakai3. This would allow us to easily place a Sakai2 tool within the context of a Sakai3 site. With any luck, we will get this sprint organized by the end of January. Until next time, L

1 Comment

Filed under Java, Sakai, Technology

maven2 bash completion complete

I have been utterly spoiled by bash completion when using svn and git for the past few months – the only thing that was missing was maven completion.  Since I could not sleep this morning, I set out to fix that.  First, a little bit of background.  I have been using MacPorts to install both subversion and git.  Both had a variant “+bash_completion” – I did not know what it did at the time, but it sounded cool so I included that variant when I installed them.

git-core @
subversion @1.6.5_0+bash_completion+no_bdb
For example: sudo port install git-core +bash_completion +doc

After digging a bit further, I figured out that I need to add the following lines to ~/.profile to get bash_completion to take off:

if [ -f /opt/local/etc/bash_completion ]; then
    . /opt/local/etc/bash_completion
fi

On the surface you might think that completion might only be aware of common command line arguments to svn and git binaries, but they are actually a little smarter. For example, in my git repository typing “git checkout <TAB>” will list all of the branches in the repository! Very handy!

So now, how to get maven2 commands into bash completion? I started with the first Google hit: Guide to Maven 2.x auto completion using BASH. That worked, but it was missing a lot of the commands I wanted easy access to, and it was not obvious to me how to extend the script. Next, Google led me to another hit: Maven Tab Auto Completion in Bash. This script had more completions out of the box and it was obvious how to add more. With some quick hacking, my /opt/local/etc/bash_completion.d/m2 now looks like:

# Bash Maven2 completion
_mvn()
{
  local cmds cur colonprefixes
  cur=${COMP_WORDS[COMP_CWORD]}
  cmds="clean validate compile test package integration-test \
verify install deploy test-compile site generate-sources \
process-sources generate-resources process-resources \
eclipse:eclipse eclipse:add-maven-repo eclipse:clean \
idea:idea -DartifactId= -DgroupId= -Dmaven.test.skip=true \
-Declipse.workspace= -DarchetypeArtifactId= \
netbeans-freeform:generate-netbeans-project \
tomcat:run tomcat:run-war tomcat:deploy \
sakai:deploy -Predeploy \
dependency:analyze dependency:resolve \
versions:display-dependency-updates versions:display-plugin-updates \
javadoc:aggregate javadoc:aggregate-jar"
  # Work-around bash_completion issue where bash interprets a colon
  # as a separator.
  # Work-around borrowed from the darcs work-around for the same
  # issue.
  colonprefixes=${cur%"${cur##*:}"}
  COMPREPLY=( $(compgen -W '$cmds' -- $cur))
  local i=${#COMPREPLY[*]}
  while [ $((--i)) -ge 0 ]; do
    COMPREPLY[$i]=${COMPREPLY[$i]#"$colonprefixes"}
  done
  return 0
} &&
complete -F _mvn mvn

You will notice that I have added the common Sakai goals like sakai:deploy or -Predeploy. I have also added some other maven plugins that I find useful. Give it a try: “mvn <TAB><TAB>” or maybe “mvn sak<TAB>” or how about “mvn ecl<TAB>”. I hope you will find bash completion just as satisfying as I do.  Best, L


Filed under Java, Technology

Sakai 3 export/import formats research

Since Sakai 3 is a ground-up rewrite, there are plenty of opportunities to rethink assumptions that have accumulated over the years. One of the areas where a fresh look should do some good is course export/import formats. Sakai 2 has its own proprietary format which provides a “full fidelity” capability to move from one course site to another without losing anything. While not losing anything is desirable, it comes at quite a cost; i.e. not being compatible with anything else on the planet. This approach is not uncommon, and I see now that Moodle also has its own proprietary format.

While there are some good open standards for course export/import (e.g. IMS Common Cartridge or SCORM), they will not provide a “full fidelity” export/import workflow where one could export from Sakai 2 and then import into Sakai 3; i.e. some information would get lost in the translation. While supporting these open standards would have other important benefits, they are not first-order solutions for simply getting from Sakai 2->3. My hope is that one day an open standard could be expressive enough to cover such a use case, but the innovation curve in the applications and tools may always exceed the ability of a lowest-common-denominator solution.

With that said, we thought it would be beneficial to see if Moodle’s export/import format provided enough capability to move data from Sakai 2->3 without losing any critical structure. If this works out, we would have one great feature out-of-the-box: the ability to move a course from Moodle to Sakai and vice-versa. It will be interesting to see how this works out and I will keep you updated as progress is made…

Leave a comment

Filed under Education, Sakai, Technology

Tags for OS X Users – Finding Stuff

I have been experimenting with some tools over the past six months to help me get better organized and increase the chances I can actually find something on my computer. I started my exploration with a program called Together from Reinvented Software.  Together was a good place for me to begin exploring the set of tags I would use and to incorporate tagging into my daily work-flows.  The things I liked about Together:

  1. It did not lock my content up into some bizarro binary file that could never be parsed again.  Instead, it neatly organized all of the files you added to Together in your Documents folder.  Seemed like a pretty reasonable thing.
  2. The tagging UI was pretty fast and had auto-complete.

These aspects of Together kept me using it for a few months until I discovered:

  1. The tags themselves were locked up in some bizarro binary file format! That did not sit well with me thinking about years and years of collected tags. While the program does have some way to sync those tags to Spotlight comments, the author warns that it will slow the application down and consume huge amounts of resources.
  2. The application became slow and unresponsive – even though I did not have the Spotlight-tag-syncing, resource-sucking option turned on.
  3. The work-flow of drag-and-drop became too cumbersome and I wanted something more streamlined.

I continued trudging through my use of Together, when one day the MacUpdate Promo had an application called Tags by Gravity Applications advertised for steep discount. It sounded like a good fit – maybe too good to be true – but the price was right and was worth trying to see if it met the hype. What I liked:

  1. Your tags are not locked away – they are part of the file’s metadata. This should allow my tags to travel as long as Spotlight is around.
  2. Spotlight can search for the tags – try a search like: tag:receipt apple.
  3. The UI is lightweight and fast. It may feel a bit unprofessional (just a personal opinion), but it cuts the mustard.
  4. It is integrated with almost every application I use, and thus knows what file I am trying to tag without some cumbersome drag-and-drop work-flow. Very smart and efficient.

I am glad to say that I am still using Tags on a daily basis and have built some Automator work-flows around it as well. And… they recently provided me with an update to get everything working smoothly with Snow Leopard. Overall, I am very satisfied with this solution – I am able to find things reliably. After using this work-flow for a while, it strikes me that this should be a base capability in OS X. I do find some evidence that Apple is headed down this path if you try the following in Snow Leopard: Print -> Save as PDF -> then play around in the Keywords field. I am seeing auto-complete. Are you?

1 Comment

Filed under Personal, Technology, Tools

Sakai3 tool development; lessons learned from rapid prototyping in jQuery

So this one definitely got me outside of my comfort zone. This is also the first presentation I have given where I did not use even one slide! I seem to have gotten favorable reactions from the audience – so I am going to share it.

Watch the Screen-cast

Description: Learn how you can use simple HTML, CSS, JavaScript, and jQuery technologies to build rapid prototypes that can be quickly user tested in short iterations. Unlike more traditional prototyping methods, jQuery allows you to build fully functional user interfaces that not only aid in the design process, but then actually become the tool code itself.

Presenter: Lance Speelmon, Indiana University

1 Comment

Filed under Sakai, Technology, Tools