Sakai 2->3 Integration Status Update

David Goodrum asked me to provide an update to the Oncourse Priorities Committee [1] regarding the progress made towards Sakai 2->3 integration and migration. In doing so, it became clear that this kind of update might also be useful to the community at large. There are two overarching goals to achieve in this project: 1) seamlessly exposing Sakai 2 functionality within Sakai 3 (which we are dubbing “hybrid” mode) and 2) content migration.

The hybrid mode will allow institutions to adopt Sakai 3 at a pace that is comfortable for them. For example, Sakai 3 will not have all of the functionality available in Sakai 2 on day one, but there will be some compelling functionality that institutions will want to adopt before all of the functionality gaps have been closed. Hybrid mode will facilitate this approach by allowing institutions to deploy Sakai 3 and take advantage of the new features while still relying on Sakai 2 for the features not yet available in Sakai 3. The following video demonstrates the first step in realizing hybrid mode: the ability to expose entire Sakai 2 sites within the Sakai 3 portal.

We anticipate that institutions will want to start using Sakai 3 for full-fledged project and committee work before course work. Hybrid mode supports this use case, as Sakai 3 can host project sites while Sakai 2 continues to serve course site needs. The next step in the hybrid mode evolution will be the ability to place individual Sakai 2 tools onto Sakai 3 pages that live in Sakai 3 sites. This will give the user the ability to mix and match both Sakai 2 and Sakai 3 functionality in a single user experience.

Next: content migration. The primary use case we are tackling first is allowing an instructor who taught a class last semester in Sakai 2 to import that site template into Sakai 3. The following video demonstrates an initial deliverable where one can export content from a Sakai 2 site and then import that content into a Sakai 3 site. Please keep in mind that this is very early work and is rough around the edges, but the methodology and underlying service are sound.

Moving forward, focus in the following areas is required:

  1. Exposing individual Sakai 2 tools on Sakai 3 pages within Sakai 3 sites.
  2. Exploring a results mechanism so that a tool hosted in Sakai 2 can report results to a tool hosted in Sakai 3 (and vice-versa). As an example, you might want to use a Sakai 3 Assignments tool which will need to report outcomes to a Sakai 2 Gradebook.
  3. Continued effort to import Sakai 2 site templates into Sakai 3 for semester-to-semester transitions.
  4. Deeper content migration that will not only transfer site templates, but also include full user content. For example, this will likely be a requirement to migrate a project site from Sakai 2->3.

Filed under Sakai

Sakai 3 Basic LTI Widget Sprint

I am pleased to update the Sakai community on the outcomes of a two-day coding sprint to produce a Basic LTI consumer widget for Sakai 3. Dr. Charles Severance, Noah Botimer, and I participated in the sprint, which the University of Michigan CTools team was kind enough to host. Don’t worry if you are not familiar with the Basic LTI specification. It is an up-and-coming specification from IMS which is currently in draft status. In a nutshell, it allows separate applications to be loosely integrated through an IFRAME. In the first iteration, known as “Basic” LTI, the remote system trusts some user information from the local system, such as the identity of the user and the roles of the user (e.g. Instructor, Learner, etc.), and then automatically provisions accounts and tools appropriately. Essentially, this allows a remote tool to appear in a learning management system (LMS) as if it were a local tool. From an industry perspective, this will give publishers, for example, a standard way to provide hosted textbook content to a variety of LMSs. It could also create an environment where we begin to select tools à la carte for inclusion in a local LMS implementation; i.e. selecting the tools that best fit the pedagogy used by the instructor. For a quick primer on LTI, I would suggest watching the video IMS Basic Learning Tools Interoperability by Dr. Chuck.
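To make that a bit more concrete, a Basic LTI launch is really just a set of well-known parameters that the consumer OAuth-signs and POSTs to the provider’s launch URL. Here is a rough Java sketch of assembling those parameters; the values are made up and the OAuth signing step is omitted, so this is an illustration rather than the widget’s actual code:

import java.util.LinkedHashMap;
import java.util.Map;

// Illustration only: assembles the core Basic LTI launch parameters that a
// consumer would OAuth-sign (HMAC-SHA1) and POST to the provider's launch URL.
public class BasicLtiLaunchSketch {
    public static void main(String[] args) {
        Map<String, String> launch = new LinkedHashMap<String, String>();
        launch.put("lti_message_type", "basic-lti-launch-request");
        launch.put("lti_version", "LTI-1p0");
        launch.put("resource_link_id", "placement-1234");        // the widget placement being launched
        launch.put("context_id", "some-sakai3-site-id");         // the site the widget lives in
        launch.put("user_id", "some-opaque-user-id");
        launch.put("roles", "Instructor");                       // trusted by the provider
        launch.put("lis_person_name_full", "Jane Instructor");   // released only if privacy settings allow
        launch.put("lis_person_contact_email_primary", "jane@example.edu");

        // In a real launch these parameters, plus oauth_consumer_key, nonce,
        // timestamp, etc., are signed with the shared secret and rendered as an
        // auto-submitting HTML form targeting the provider inside an IFRAME.
        for (Map.Entry<String, String> e : launch.entrySet()) {
            System.out.println(e.getKey() + "=" + e.getValue());
        }
    }
}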

So the reason that I wanted to explore LTI was perhaps a bit less lofty. Simply, I wanted to use it as a mechanism to place a Sakai 2 tool into a Sakai 3 site as a widget on a page. The work that has been accomplished to date (see: Sakai 2+3 Hybrid Status Update) has been largely about exposing entire Sakai 2 sites in the Sakai 3 portal. This work is still very much applicable, but supports a different use case than what I hoped to accomplish with an LTI widget. In this case, we want to be able to support a mixture of Sakai 2 and Sakai 3 tools and widgets within the context of a Sakai 3 site. This will help us make the transition from 2->3 without a “big-bang” approach. For example, Sakai 3 will not have a PostEm tool on day one, and that should not present a roadblock to Sakai 3 adoption. You should simply be able to place the PostEm tool from Sakai 2 in your Sakai 3 site along with all of the other native Sakai 3 widgets.

So after two days of intense coding, we had a widget that could consume a sample Basic LTI Provider. Many thanks again for the help from Dr. Chuck and Noah Botimer! You guys are awesome! There was certainly more work to do, but the plumbing had been laid and now I could focus on refining working code. Beyond the mechanics of creating an LTI launch, we wanted to add some settings to the widget:

  1. Create “virtual tool” registrations so that the same widget could be provisioned many times under different names (i.e. the same widget could be reused for sakai2.calendar, sakai2.postem, etc.).
  2. Allow for administrative control over Basic LTI widget placements; i.e. the system administrator could define default values and/or lock settings that could not be modified by instructors. For example, you could use these locked settings to create the virtual tools described in #1.
  3. Settings to control user privacy (see the sketch after this list):
    1. Should the user’s name be released to the remote system (i.e. first, last, full names)?
    2. Should the user’s email address be released to the remote system?
    3. Should the user’s user-name be released to the remote system?
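To illustrate how those privacy settings gate the launch data, here is a sketch with made-up setting names; these are not the widget’s real property keys:

import java.util.Map;

// Sketch only: strip user data from the launch parameters before signing,
// according to the widget's privacy settings. Setting names are placeholders.
public class PrivacyFilterSketch {
    static void applyPrivacy(Map<String, String> launch, boolean releaseNames,
                             boolean releaseEmail, boolean releaseUsername) {
        if (!releaseNames) {
            launch.remove("lis_person_name_given");
            launch.remove("lis_person_name_family");
            launch.remove("lis_person_name_full");
        }
        if (!releaseEmail) {
            launch.remove("lis_person_contact_email_primary");
        }
        if (!releaseUsername) {
            launch.remove("lis_person_sourcedid"); // the user-name / enterprise id
        }
    }
}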

I spent the next week wrapping up these enhancements to the code. The next step was to invoke the Basic LTI provider that is currently slated for release in Sakai 2.7.0. I mean, that was the whole point of this exercise after all! In doing so, I discovered a bug that caused the Sakai 2 provider to choke – it did not like the “/” character in my context_id variable; see: BLTI-23. The shortest path to resolution was rolling my sleeves up and fixing the code, and along the way I got wrangled into becoming a committer on the BLTI project! After fixing BLTI-23, we decided that the length of the Sakai 2 siteId could be a problem as we cannot predict the length of context_id that remote systems will pass, so next came BLTI-24. 🙂 A few commits later, and we were in business! Yeah! I could display a Sakai 2 tool within a Sakai 3 site, placed as a widget on a page.
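For the curious, the shape of the problem behind BLTI-23 and BLTI-24 looks roughly like the sketch below. This is only an illustration of the idea, not the patch that actually went into the BLTI project:

// Illustration only: normalize an incoming context_id so it is safe to use as
// a Sakai 2 site id. Not the actual BLTI-23/BLTI-24 fix.
public class ContextIdSketch {
    static String normalize(String contextId, int maxLength) {
        String safe = contextId.replace("/", "_"); // BLTI-23: "/" broke the provider
        if (safe.length() > maxLength) {           // BLTI-24: context_id lengths are unpredictable
            safe = safe.substring(0, maxLength);
        }
        return safe;
    }
}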

So what is next?

  1. At least one screencast demonstrating the work to date – probably two. One showing the Basic LTI plumbing and passing of the LTI certification tests, and another showing Sakai 2 tool placements in Sakai 3 sites.
  2. Some of the information in an LTI launch needs to be secured from end users. This will be a chance for me to learn more about access control in Sakai 3; see: KERN-591.
  3. The widget itself needs some renovation. I will likely copy the existing Remote Content widget and add the LTI settings.
  4. Create some virtual tools for existing Sakai 2 tools and add these to the out of box Sakai 3 experience.
  5. Work on the Sakai 2 Basic LTI provider servlet to make it more robust and support the specific Sakai 2/3 integration use cases.

Filed under Sakai

Sakai 2+3 Hybrid Status Update

I am happy to provide you with a status update on the progress that has been made since the previous post: Sakai 2+3 Hybrid pre-alpha. I think you will be most pleased with the outcome, which in many ways provides a better user experience than what is available in the Sakai 2 portal! The speed at which you can switch between tools is quite amazing. Check it out:

Many thanks to: Ian Boston, Paul Bristow, Oszkar Nagy, Christian Vuerings.

Filed under Sakai

Importing content from Sakai 2 into Sakai 3 (take 3)

As the week begins winding down, I wanted to draw your attention to some progress I have made in this area since my last blog entry: Importing content from Sakai 2 into Sakai 3 (take 2). First, I would like to point you to a screencast I recorded on the subject earlier today:

Next, I would like to address some of the action items from the last post:

  1. All user uploaded content is currently stored in a BigStore under /_user/files. After discussing with Ian Boston, I will most likely refactor the import code to store its content in that BigStore as well. That said, the BigStore concept will likely be redesigned in the near future, so any work I do in this area will be nicely abstracted so that this behavior can be changed easily if and when BigStore is redesigned.
  2. With the move to BigStore, I will have to take a look at access control lists (ACLs) so that the user importing content will have the proper permissions.
  3. Next, I need to take a look at the contract between K2 and the “Content & Media” widget so that the imported content appears properly within the user interface.

This work is complete and you can see it working in the video. The refactor to the /_user/files BigStore was pretty straightforward, except for one self-inflicted bug, and luckily the call to FileUtils.saveFile() already handled all of the access control permissions for me. The Content & Media widget expects files to be stored under /_user/files, so things started behaving as expected after #1 was complete.

I have confirmed plans with Dr. Chuck and Noah Botimer to spend two days developing a Sakai 3 BasicLTI consumer. I am excited to tackle that problem – coding sprints are always full of such great energy! Until next time, Lance

Filed under Sakai

Importing content from Sakai 2 into Sakai 3 (take 2)

Since returning from holiday, I have rejoined the matter of Importing content from Sakai 2 into Sakai 3. The first order of business was to refactor the XML parsing from SAX to StAX to deal with some potentially nasty classloader issues, as suggested by Dr. Ian Boston. That went smoothly, and I have to say that after using SAX, StAX is a much improved utility: you have more control and can pull events from the stream rather than having them pushed to you. This makes for more natural and supportable Java code.
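If you have not used StAX before, the pull style looks roughly like the minimal sketch below, which walks an XML stream looking for resource elements. The element and attribute names here are illustrative, and this is not the actual import code:

import java.io.InputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

// Minimal StAX sketch: the caller pulls events from the stream instead of
// receiving SAX callbacks, which keeps the parsing logic in one place.
public class StaxSketch {
    public static void parse(InputStream in) throws Exception {
        XMLStreamReader reader = XMLInputFactory.newInstance().createXMLStreamReader(in);
        while (reader.hasNext()) {
            int event = reader.next();
            if (event == XMLStreamConstants.START_ELEMENT
                    && "resource".equals(reader.getLocalName())) {
                // Pull out the attributes we care about for the import.
                String id = reader.getAttributeValue(null, "id");
                System.out.println("resource id: " + id);
            }
        }
        reader.close();
    }
}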

Next, there were some improvements that I wanted to make to the import code:

  1. Support for org.sakaiproject.content.types.urlResource types.
  2. Adding the metadata to the imported content.

After a week of plugging away on both fronts, some good progress has been made. First, a model conversion had to be considered for org.sakaiproject.content.types.urlResource types. In Sakai 2, these URL resources are simply presented in the UI as hyperlinks that open in a new window. Given the RESTful nature of Kernel2 (K2), I needed to decide how to best represent a hyperlink. My first thought was using the proxy capabilities of K2, but that presented some issues, as proxy nodes must be stored under /var/proxy and the whole notion of proxying HTTP requests has security implications – that is why K2 does not allow just anyone to create a proxy node.

I was probably too close to the problem and had trouble seeing the more obvious solution – why not use an HTTP redirect? After noodling the problem for a while, the simpler solution finally entered my brain. After a bit of acking, I found that Sling already has support for redirects through its RedirectServlet, which binds to sling:resourceType=sling:redirect. So then, it was just a fairly simple matter of creating a node and setting the properties accordingly:

{
  "sling:resourceType": "sling:redirect",
  "sling:target": "http://sakaiproject.org/",
  "sakai:id": "AirbV1U-",
  "sakai:user": "admin",
  "jcr:mimeType": "text/url",
  "sakai:filename": "http://sakaiproject.org/",
  "jcr:created": "Wed Oct 07 2009 13:53:00 GMT-0400",
  "jcr:lastModified": "Wed Oct 07 2009 13:53:00 GMT-0400",
  "jcr:primaryType": "nt:unstructured"
}
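For reference, creating such a node programmatically through the JCR API looks roughly like the sketch below. The property names mirror the JSON dump above, while the path, session handling, and error handling are simplified:

import javax.jcr.Node;
import javax.jcr.Session;

// Sketch: create a node that Sling's RedirectServlet will resolve as a redirect.
// Property names mirror the JSON above; session handling is simplified.
public class RedirectNodeSketch {
    public static Node createRedirect(Session session, Node parent, String name, String targetUrl)
            throws Exception {
        Node node = parent.addNode(name, "nt:unstructured");
        node.setProperty("sling:resourceType", "sling:redirect");
        node.setProperty("sling:target", targetUrl);
        node.setProperty("sakai:filename", targetUrl);
        session.save();
        return node;
    }
}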

That was pretty much it for org.sakaiproject.content.types.urlResource types; the redirect works as expected. There are still a couple of things I would like to improve in this area:

  1. The node names for these urlResources need to be beautified. As the resource name comes through in the content.xml, it looks like “http:__sakaiproject.org_”. I have to strip out the “:” to avoid a JCR exception, so the node name currently looks like “http__sakaiproject.org_”. Ideally, it would match the display name, i.e. “http://sakaiproject.org/”. Perhaps some manner of escaping invalid characters might work (see the sketch after this list), but further digging into the JCR is required. I am able to set "sakai:filename":"http://sakaiproject.org/", so maybe that is good enough; TBD.
  2. Since the jcr:primaryType==nt:unstructured, the URL is rendered as a folder when connected via WebDAV. It would be nice to get these URLs to render as a leaf node instead. I experimented with jcr:primaryType=nt:file, but ran into some roadblocks and backed off.
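Regarding the escaping idea in item 1, one candidate I may look at is the Text utility from jackrabbit-jcr-commons (an assumption on my part; I have not wired this in yet). It produces a reversible, if not beautiful, node name:

import org.apache.jackrabbit.util.Text;

// Sketch: escape characters that are illegal in JCR node names (such as ":")
// while keeping the original URL in a display property like sakai:filename.
public class NodeNameSketch {
    static String toNodeName(String displayName) {
        return Text.escapeIllegalJcrChars(displayName);
    }
}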

Regarding the mapping of metadata, that task proved to be mostly straightforward for the fields that have a one-to-one mapping. However, there are currently more supported metadata fields in Sakai 2 than there are in Sakai 3. There is no limitation on the number or type of metadata fields that can be stored in Sakai 3, so I am considering just storing all of the fields from Sakai 2 as a precaution and possible future-proofing. I am left wondering whether to store them with their current keys or to prepend something like “sakai2:” to all of the keys before storing them.
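If I do go the prefixing route, the transformation itself is trivial; here is a sketch (the keys shown are illustrative, and the real properties would be written into the JCR node rather than a plain map):

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: carry all Sakai 2 metadata across, prefixing keys so they cannot
// collide with native Sakai 3 properties.
public class MetadataPrefixSketch {
    static Map<String, String> prefix(Map<String, String> sakai2Props) {
        Map<String, String> out = new LinkedHashMap<String, String>();
        for (Map.Entry<String, String> e : sakai2Props.entrySet()) {
            out.put("sakai2:" + e.getKey(), e.getValue());
        }
        return out;
    }
}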

Looking towards the near term, I am likely to look into the following issues:

  1. All user uploaded content is currently stored in a BigStore under /_user/files. After discussing with Ian Boston, I will most likely refactor the import code to store its content in that BigStore as well. That said, the BigStore concept will likely be redesigned in the near future, so any work I do in this area will be nicely abstracted so that this behavior can be changed easily if and when BigStore is redesigned.
  2. With the move to BigStore, I will have to take a look at access control lists (ACLs) so that the user importing content will have the proper permissions.
  3. Next, I need to take a look at the contract between K2 and the “Content & Media” widget so that the imported content appears properly within the user interface.
  4. What about other content types that could be imported today? Content from the Forums tool may be a good candidate as K2 currently has support for threaded discussions. Chat might be another place to look… Other ideas?

Regarding the Sakai 2+3 Hybrid mode, I have hopes to arrange a two-day coding sprint with Dr. Chuck Severance and Noah Botimer to develop a BasicLTI consumer for Sakai 3. This would allow us to easily place a Sakai 2 tool within the context of a Sakai 3 site. With any luck, we will get this sprint organized by the end of January. Until next time, L

Filed under Java, Sakai, Technology

Importing content from Sakai 2 into Sakai 3 (take 1)

Development was starting to slow down for me on the Sakai 2+3 Hybrid Mode, so I needed to turn my primary focus elsewhere. Michael Korcuska and I had decided previously that the next focus point would be to develop a working prototype that would allow someone to take a zip file exported from Sakai 2’s Site Archive tool and import the content into Sakai 3. Initially the scope would be limited to just the content contained within the Resources tool (a.k.a. ContentHostingService), since Sakai 3 currently has enough functionality to support the files and folders model.

When I started down this path, I did not expect to reach a stopping point by the end of the week. Frankly I thought it would take longer. But after a couple of days, I had the logic around parsing the content.xml file and extracting the content into my local file system working pretty well. The next couple of days were spent porting this working code into Kernel2 as a SlingServlet and creating a RESTful web service. After a couple of bumps in the road and someone moving my cheese, I am pleased to say that the first iteration of this work is complete.

As an example, you can take the sample archive.zip file which came from a Sakai 2 test instance, and upload it to Sakai 3:

curl -F"path=/site/import/folder" -F"Filedata=@archive.zip" http://username:password@localhost:8080/foo.sitearchive.json

The web service expects two parameters (a rough servlet sketch follows the list):

  1. path: The path to a folder where you want the content imported.
  2. Filedata: one or more zip files to import.
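For a sense of how the servlet side consumes those parameters, here is a rough sketch; the real implementation lives in my github repository, and the class and variable names below are simplified:

import java.io.IOException;
import java.io.InputStream;
import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;
import org.apache.sling.api.request.RequestParameter;
import org.apache.sling.api.servlets.SlingAllMethodsServlet;

// Rough sketch of the upload handling: read the target path and the uploaded
// zip file(s), then hand each zip off to the import logic (omitted here).
public class SiteArchiveImportSketchServlet extends SlingAllMethodsServlet {
    @Override
    protected void doPost(SlingHttpServletRequest request, SlingHttpServletResponse response)
            throws IOException {
        String path = request.getParameter("path");
        RequestParameter[] files = request.getRequestParameters("Filedata");
        if (path == null || files == null) {
            response.sendError(400, "Both 'path' and 'Filedata' are required");
            return;
        }
        for (RequestParameter file : files) {
            InputStream zip = file.getInputStream();
            // The import logic (omitted) walks content.xml inside the zip and
            // writes the extracted resources beneath 'path'.
        }
    }
}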

The result will be a folder that looks like the following screen shot:

While there is still much to be done (e.g. mapping file meta-data, support for more resource types, etc.), this is an important first step. First, it demonstrates technical feasibility. Second, it creates the beginnings of a framework that can be extended to support importing other Sakai 2 tools, and eventually entirely different import formats like IMS Common Cartridge. If you are interested in looking at the code, it can be found at my github repository.

Looking forward, I will likely begin investigating IMS Basic LTI as a mechanism to enhance the Sakai 2+3 Hybrid capabilities. Currently, the hybrid mode supports entire sites (i.e. the user chooses to enter either a Sakai 3 site or a Sakai 2 site via the Sakai 3 portal). Ideally, one should be able to mix and match tools from either Sakai 2 or 3 in a Sakai 3 site. Dr. Chuck has done some good work in this area – Sakai 2.7.0 will have both a BasicLTI consumer and producer. So theoretically, if Sakai 3 had a BasicLTI consumer, it could present a Sakai 2 tool to a user as a Sakai 3 widget. My hope is that among Dr. Chuck, Noah Botimer, and myself, we could turn out a Sakai 3 LTI consumer relatively quickly. More to come in the new year. Best regards, L

Filed under Java, Sakai

maven2 bash completion complete

I have been utterly spoiled by bash completion when using svn and git for the past few months – the only thing that was missing was maven completion.  Since I could not sleep this morning, I set out to fix that.  First, a little bit of background.  I have been using MacPorts to install both subversion and git.  Both had a variant “+bash_completion” – I did not know what it did at the time, but it sounded cool so I included that variant when I installed them.

git-core @1.6.5.3_0+bash_completion+doc
subversion @1.6.5_0+bash_completion+no_bdb

For example: sudo port install git-core +bash_completion +doc

After digging a bit further, I figured out that I need to add the following lines to ~/.profile to get bash_completion to take off:

if [ -f /opt/local/etc/bash_completion ]; then
. /opt/local/etc/bash_completion
fi

On the surface you might think that completion might only be aware of common command line arguments to svn and git binaries, but they are actually a little smarter. For example, in my git repository typing “git checkout <TAB>” will list all of the branches in the repository! Very handy!

So now, how to get maven2 commands into bash completion? I started with the first Google hit: Guide to Maven 2.x auto completion using BASH. That worked, but it was missing a lot of the commands I wanted easy access to, and it was not obvious to me how to extend their script. Next, Google led me to another hit: Maven Tab Auto Completion in Bash. This script had more completions out of the box and it was obvious how to add more. With some quick hacking, my /opt/local/etc/bash_completion.d/m2 now looks like:

# Bash Maven2 completion
#
_mvn()
{
  local cmds cur colonprefixes
  cmds="clean validate compile test package integration-test \
    verify install deploy test-compile site generate-sources \
    process-sources generate-resources process-resources \
    eclipse:eclipse eclipse:add-maven-repo eclipse:clean \
    idea:idea -DartifactId= -DgroupId= -Dmaven.test.skip=true \
    -Declipse.workspace= -DarchetypeArtifactId= \
    netbeans-freeform:generate-netbeans-project \
    tomcat:run tomcat:run-war tomcat:deploy \
    sakai:deploy -Predeploy \
    dependency:analyze dependency:resolve \
    versions:display-dependency-updates versions:display-plugin-updates \
    javadoc:aggregate javadoc:aggregate-jar \
    source:aggregate"
  COMPREPLY=()
  cur=${COMP_WORDS[COMP_CWORD]}
  # Work-around bash_completion issue where bash interprets a colon
  # as a separator. Borrowed from the darcs work-around for the same issue.
  colonprefixes=${cur%"${cur##*:}"}
  COMPREPLY=( $(compgen -W '$cmds' -- $cur))
  local i=${#COMPREPLY[*]}
  while [ $((--i)) -ge 0 ]; do
    COMPREPLY[$i]=${COMPREPLY[$i]#"$colonprefixes"}
  done
  return 0
} &&
complete -F _mvn mvn

You will notice that I have added the common Sakai goals like sakai:deploy or -Predeploy. I have also added some other maven plugins that I find useful. Give it a try: “mvn <TAB><TAB>” or maybe “mvn sak<TAB>” or how about “mvn ecl<TAB>”. I hope you will find bash completion just as satisfying as I do.  Best, L

Filed under Java, Technology

Sneak Preview: Sakai 2+3 Hybrid pre-alpha

This video gives you the first glimpse of the hybrid integration between Sakai 2 and Sakai 3. This is a very early look and does not show a finished product, but instead enough of the user interface to begin real discussion and refinement.

Many thanks to: Ian Boston, Paul Bristow, Oszkar Nagy, Christian Vuerings.

Filed under Sakai

Investigating site exports from Sakai 2

So the export/import file format investigation has reached some early conclusions regarding the use of Moodle’s backup schema, and it looks like we will be looking elsewhere. See: Moodle export-import format investigation and the email thread itself.

While we ponder IMS Common Cartridge, I thought I would investigate what it would take to provide the capability of exporting Sakai 2 sites into the existing Sakai 2 proprietary XML format. This is a long-standing request within the Sakai community, but one that no one has been willing to tackle. The situation is not as bad as it sounds, as most tools do participate in the method EntityTransferrer.transferCopyEntities(), so it is possible to copy the structure of a site from semester to semester. I use the term “structure” because it is common practice among LMS applications to copy only what might be termed a “template” across semesters. For example, this copy process would include content like forum definitions, but not student responses; grade book items, but not student grades; etc. The primary use case is that an instructor who taught a class last semester can import that previous site into the current semester’s course site to reduce setup time.

So far so good – but here is where things get a bit dodgy… The EntityTransferrer.transferCopyEntities() method copies entities directly from one site to another (i.e. without writing any of these entities to XML). While Sakai 2 does have a mechanism for writing entities to XML, called ArchiveService.archive(), there are at least two problems with it: 1) unlike transferCopyEntities(), all student postings, grades, etc. are included in the XML produced (i.e. it is more like a site backup), and 2) only a small subset of tools actually implement the ArchiveService.archive() interface! So this leaves me wondering:

  1. Does anyone actually depend on ArchiveService.archive()? My instincts tell me no since most of the tools do not implement it. Am I wrong?
  2. Could we usurp the ArchiveService.archive() interface and change the behavior so that only site structure is exported without student content?
  3. Do we leave ArchiveService.archive() alone and create a new API? (A strawman sketch follows this list.)
  4. How many tools still need to implement archive()?
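To make question 3 concrete, a structure-only export API might look something like the strawman below. This is purely illustrative for discussion; it is not an existing Sakai interface, and the method names are mine:

import java.io.OutputStream;
import java.util.List;

// Strawman only: a hypothetical interface for exporting site *structure*
// (tool configuration, templates) to XML without any student-generated content.
// Not an existing Sakai 2 API.
public interface SiteStructureExporter {
    /** Tool ids this exporter can handle. */
    String[] myToolIds();

    /** Write the structure of the given entities from siteId to the stream as XML. */
    void exportStructure(String siteId, List<String> entityIds, OutputStream out);
}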


Filed under Java, Sakai

Sakai 3 export/import formats research

Since Sakai 3 is a ground-up rewrite, there are plenty of opportunities to rethink assumptions that have accumulated over the years. One of the areas where a fresh look should do some good is course export/import formats. Sakai 2 has its own proprietary format which provides a “full fidelity” capability to move from one course site to another without losing anything. While not losing anything is desirable, it comes at quite a cost; i.e. not being compatible with anything else on the planet. This approach is not uncommon, and I see now that Moodle also has its own proprietary format.

While there are some good open standards for course export/import (e.g. IMS Common Cartridge or SCORM), they will not provide a “full fidelity” export/import workflow where one could export from Sakai 2 and then import into Sakai 3; i.e. some information would get lost in the translation. Although supporting these open standards would have other important benefits, they are not first-order solutions for simply getting from Sakai 2->3. My hope is that one day an open standard could be expressive enough to cover such a use case, but the innovation curve in the applications and tools may always exceed the ability of a lowest-common-denominator solution.

With that said, we thought it would be beneficial to see if Moodle’s export/import format provided enough capability to move data from Sakai 2->3 without losing any critical structure. If this worked out, we would have one great feature out of the box: the ability to move a course from Moodle to Sakai and vice-versa. It will be interesting to see how this works out, and I will keep you updated as progress is made…

Filed under Education, Sakai, Technology