Don’t worry… be XAPI!


Your LACE reporter Fabrizio Cardinali, CEO of Skillaware, live from DevLearn 2015, MGM Grand Las Vegas


The “innovation in the making” motto displayed all across this year’s DevLearn conference and exhibit spaces at the MGM Grand Las Vegas, September 30 to October 2.

Don’t worry… be XAPI!

That was the tune this year at the DevLearn Conference & Expo, a learning technologies event held annually at the MGM Grand Las Vegas Hotel & Casino. DevLearn is known to draw the leading learning technologies vendors, developers and technical opinion-makers from all around the world to celebrate innovation in learning technologies, and in that respect this year’s event was no different. I will note, however, that amongst the 1000+ attendees you could spot only a few from the always-experimenting EU research community and the always-developing Asian government groups (a noticeable reduction compared to the multi-ethnic presence at DevLearn in former years).

From a full house at the pre-conference “xAPI Camp” workshop (despite a USD 500 additional charge to attend), well organised by Megan Bowe and Aaron Silvers of makingbetter.us, to several well-positioned hot-spot speeches at the main conference, to the long list of vendor exhibitions during the two-day expo, and finally to the 50+ tables and projects showcased at DemoFest, XAPI was “the thing” to have at DevLearn last week.

Much like in the early, hype-inflated days of SCORM, XAPI turned out to be the sole common denominator making platform vendors, corporate buyers, and service and content providers realize that, yes, they will need interoperability standards in order to make the corporate learning ecosystem they all dream of a reality.


The MGM Grand conference room, fully packed with DevLearn’s 1000+ attendees.

“XAPI-compliant” was the magic phrase this year, making top-gear corporate buyers smile and nod their heads when shown demos of new solutions, though of course no one informed them that in reality there is no body (yet) entitled to (fully) certify such compliance…

As a long-time learning standards rider and recurring learning technologies start-upper, I myself thought “XAPI readiness” was the natural thing to add to the top line of the glossy new corporate brochure prepared for the U.S. launch of our (insistently European) new LT solution at the event: Skillaware™, an innovative performance support and learning analytics platform blending XAPI, DITA and BPMN-based interoperability standards to support rapid, enterprise-scale workforce on-boarding during the rollout of new software platforms and procedures.


An XAPI graph displayed during the DevLearn days to summarise the concept.

So, what was the worry amidst all this XAPIness?

Well, what concerned me was the intrinsically questionable nature of LT standards. OK, I know XAPI is still just a spec as long as ISO, CEN or at least IEEE don’t elevate it to an official standard, but it might just as well be dead before finishing the 5- to 10-year standardization roadmap that this would require anyway. So let’s agree to call it a “candidate” de facto standard for the time being.

First of all, some think learning standards are an oxymoron. I don’t fall into this category, the reason being that, as a solution vendor, I tend to favor the technical interoperability aspect of LT standards over the pedagogical one underpinning the solution, which by definition should continuously adapt itself to learning situations and learner needs rather than standardize itself.

In that (technical) sense, XAPI is a huge leap forward from SCORM. Its intent is to reduce everything to a simple yet all-encompassing, highly granular triplet of nouns, verbs and objects, which any “data producer” can share with others consuming it, describing what is happening in the (learning) world in which the learner is performing.
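To make the noun-verb-object triplet concrete, here is a minimal sketch of an XAPI-style statement. The learner name, email and activity IRI are hypothetical placeholders; the verb IRI is one of the commonly used ADL verbs.

```python
import json

# A minimal xAPI-style statement: actor (noun), verb, object.
# The actor and activity identifiers below are illustrative placeholders.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/ebook-chapter-1",
        "definition": {"name": {"en-US": "e-Book, chapter 1"}},
    },
}

# Statements travel as JSON between producers and consumers.
payload = json.dumps(statement)
print(payload)
```

The same three-part shape covers anything from “read a chapter” to “entered a classroom”; only the verb and object identifiers change.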

In terms of being capable of tracking any user doing anything whatsoever, from reading an article in an e-Book to entering a (physical) classroom to utilizing a performance support aid while using a software tool (…and if the latter sounds like Skillaware’s SkillAgent™, you’re right!), XAPI has made a great effort in fine-tuning the monitoring, evaluation and analysis of learning in the real world, not to mention the evidence and outcomes related to it.

In fact, XAPI’s triplet statements can be sent to any second-party repository, called a learning record store. They can also be sent (and here is the magic) to any independent third-party consumer wishing to track and analyze user activity and performance in real time, not only to offer better and more appropriate buying or traveling advice, but also to better adapt the surrounding (eco)systems and solutions, all towards the goal of continuous performance appraisal and improvement.
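As a sketch of what sending a statement to a learning record store involves, the following standard-library-only example prepares (but does not send) the HTTP request. The endpoint URL and credentials are hypothetical; the `X-Experience-API-Version` header, however, is required by the xAPI specification.

```python
import base64
import json
import urllib.request

def build_lrs_request(endpoint, statement, username, password):
    """Prepare a POST to an LRS statements resource (not sent here)."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url=f"{endpoint}/statements",
        data=json.dumps(statement).encode("utf-8"),
        method="POST",
        headers={
            "Content-Type": "application/json",
            # Version header mandated by the xAPI specification.
            "X-Experience-API-Version": "1.0.3",
            "Authorization": f"Basic {token}",
        },
    )

# Hypothetical endpoint, credentials and statement, for illustration only.
req = build_lrs_request(
    "https://lrs.example.com/xapi",
    {"actor": {"mbox": "mailto:learner@example.com"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced"},
     "object": {"id": "http://example.com/activities/ebook-chapter-1"}},
    "user", "secret",
)
# Against a real LRS, the request would be sent with urllib.request.urlopen(req).
print(req.full_url, req.get_method())
```

Because the transport is plain HTTPS plus JSON, the same statement can be fanned out to a second-party LRS and to any number of third-party analytics consumers.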


The LACE table set up at the xAPI Camp during the pre-conference day.

So where are the cons?

The issue is that, because the concept is so useful and simple, it will become attractive to every vertical, along the lines of how the first-generation LT interoperability specs (e.g. Content Packaging and Learning Design) became very interesting to the (then) emerging e-Publishing and social networking markets.

But…and yes, there is a but…while other digital markets saw success and a tenfold increase in market share, none of the first-generation LT standards (nearly all of which, by the way, were invented and funded in the EU) made it to the mainstream digital market, aside from niche adoption for LT interoperability. EPUB, MPEG and many others have been far more successful than LT specs, despite being more limited in terms of the initial intent and capability of their inventors and developers.

So what?

As it advances, XAPI will become both a huge opportunity and a threat for the learning technologies industry, its stakeholders and their everlasting quest to become (truly) mainstream in the digital kingdom.

With the advent of XAPI, digital learning can now migrate everywhere and everywhere can potentially offer learning experiences.

At the same time, however, anyone will be able to claim to favor learning the right way. And in the end, the muscle power of the endorsing stakeholders, the public support and backing of initiatives, and finally the overall satisfaction of the end buyers will make the real difference in how fast a spec is accepted, aligned and made mainstream. Hopefully, the market remains open and not in the hands of a single vendor.

Although ADL is increasingly backing what seems to be the next generation of learning technologies interoperability amongst its providers, and XAPI is building up its candidacy to have that stewardship managed, LT communities are still arguing over how to move from mock-ups and pilots to real-field applications: there is a long road ahead and many miles remain to be covered (e.g. a shared means of accessing registries of common verb vocabularies).

And what if someone wakes up and just gets it done, while we LT professionals continue debating and questioning as usual?

What will happen if, eventually, the Ciscos and Huaweis of IoT agree that standardizing user activity tracking with and amongst their IoT magic boxes is not a bad idea after all? Or if the Siemenses and Rockwells of the industrial world start to understand the crucial value of learning analytics interoperability in empowering the rapid and extensive learning curves their smart manufacturing plants require?

They might just do it and blow the lid off the LT industry while it sits and questions instead of uniting, taking action and implementing, without once again arguing over whether XAPI is really the best format for exchanging (learning) analytics.

After all, it does not matter whether what you are proposing to track is a “learning” activity by anyone else’s definition. If you are able to standardize and showcase the way you track interactions across the different vendors of the value chain, be it reading an e-Book or entering a physical or virtual (class)room, you have a gold mine of opportunity ahead.

Smart standards, after all, are still developed by (smart) humans. And smart technicians tend to protect, preserve and show off their work, rather than converge and align to embrace someone else’s achievement.

Learning & Development departments across corporate America are feeling the pressure to improve their HR performance appraisal and improvement processes, but they are still far less stressed than their manufacturing or retail counterparts. And the signals are that the latter will act upon user activity tracking and analytics exchange standardisation much faster and in a much more united way than learning stakeholders, who are delaying action as usual, arguing over if and what is better to embrace.

LT standards can’t afford such delay, nor can they afford the sort of brotherhood-killing dualism we lived through in the SCORM-IMS days. Believe me when I say that the question is not which of XAPI, Caliper or whatever else the different learning communities might think of is going to survive… The question is whether “learning” will survive “analytics” if IoT and cyber-physical systems become mainstream before the LT industry takes action.

In the end, the winner takes all. It might take learning too.

 

 

 


About Author

Fabrizio Cardinali is one of the EU's leading technology enhanced learning solutions entrepreneurs and interoperability standards experts. After helping to start, position and sell several learning technologies companies worldwide (e.g. Giunti Labs, eXact Learning NA in the US and HarvestRoad Hive in Australia), Fabrizio is today CEO of Skillaware (www.skillaware.com), a new-generation performance support and learning analytics solution for workforce training and engagement during the rollout of new software platforms and procedures. Skillaware's innovative design is natively based on open interoperability standards such as XAPI, DITA and BPMN. Fabrizio leads Work Package 5 of the LACE project, dedicated to promoting awareness and use of learning analytics solutions and standards in workplace learning and performance support scenarios.

6 Comments

  1. Nice post, Fabrizio,

    I haven’t been at DevLearn but observed it from France via my numerous friends in L&D.

    I’m an IT professional and I’ve been in this xAPI soup for the last 24 months. I ended up with about the same conclusions as you.

    xAPI is way too generic. It’s just a box with compartments. It requires an effort in defining semantics on top of it, which nobody is ready/willing to start.

    The governance of the project is questionable, as are vendor practices. I’ve been on many standardization committees and I don’t see xAPI ready to start this process.

    What has been done up to now could be caught up by a new entrant in a few months.

    I’m pleading for an xAPI 2.0 based on proper governance and business rules.

    -Bruno

    • Bruno,

      I, as always, appreciate you putting it out there and speaking your mind. I hope you don’t mind if I try to address your comment here.

      > xAPI is way too generic. It’s just a box with compartments. It requires an effort in defining semantics on top of it, which nobody is ready/willing to start.

      I hold that the fact that xAPI is generalized is among its core strengths. While it makes it really hard to just “turn the key” to get a solution out of it, I think the effort for defining semantics is fair to demand of a community for whom data ownership is valuable. We have to understand, at least in part, how data gets made. And, to your second point, there *are* already cohesive efforts to start addressing the semantics in a comprehensive and unified way… and in the next year I believe those efforts will scale.

      > The governance of the project is questionable, as are vendor practices. I’ve been on many standardization committees and I don’t see xAPI ready to start this process.

      This feels like a loaded statement. I’ll simply ask: which vendors are questionable, and which of their practices? It’s hard to address this without specifics, and if it’s something for email or a call, I’m happy to chat with you about it.

      As for whether anyone is ready to start the process… well…

      http://xapiquarterly.com/2015/09/the-way-of-xapis-consortium/

      > What has been done up to now could be caught up by a new entrant in a few months.

      I guess that depends on what aims a “new entrant” has and how one defines “what has been done up to now.” I disagree with such a dismissal of the comprehensive open effort that’s been put forward to date, but we can disagree on this.

      > I’m pleading for an xAPI 2.0 based on proper governance and business rules.

      I can promise you there will be governance and business rules in DISC, the consortium that will steward xAPI… but as we’re in the process of forming the bylaws, I can tell you I’m keen to maintain the energy the community has, to structure (and, when we can, federate) collective resources, and to put in place a board that reflects and respects the varied applications of xAPI.

      That this will not be your grandpa’s standards body is what I’m suggesting. :)

  2. Interesting perspectives.

    Since the xAPI specification isn’t very rigid, it might seem half-baked when you try to use it to implement something. I guess it might have been created that way intentionally? They’re hoping that the e-learning industry will actually unite, use this as a building block and build best practices around it? I agree that xAPI definitely is a good step in the right direction and something the global e-learning community can work with.

    I’ve been implementing “xAPI-compliant” 😉 interactive content for some months now, and it is really hard to find examples and best practices. You usually have to choose between different verbs, for instance, where none of the existing ones is perfect, but you don’t want to create yet another one. A lot of the documentation that does exist doesn’t allow comments, which makes it hard for people to help improve and discuss it.

    I’m contributing to an open source project called H5P and we’ve of course decided to publish our xAPI coverage. We will continually improve how we document our xAPI coverage and facilitate discussions around it. Hopefully, by doing this, we can make a small contribution towards generating recipes and best practices. We urge others to do the same. I think more openness around implementations of xAPI will help xAPI improve much more quickly by identifying and spreading best practices.

    Here you can find our xAPI coverage for Drag the words:

    https://h5p.org/node/1396/xapi-coverage

    On https://H5P.org you can also find xAPI coverage for our other interactive content types, like Interactive Video, Interactive Presentation and Question Set.

    We hope others will reuse our practices and help us improve by providing feedback.

    • Although the xAPI spec requires each vocabulary item to have a unique ID, it seems very likely that multiple identities will be used by different teams for the same behavior. One possible reason is that xAPI best practice is evolving, so history management is inevitable. Another possible reason is that human interpretations of, and choices for, one behavior can differ, especially across different languages (vocabularies in different languages don’t have a 1-to-1 mapping in semantic meaning). It’s not a problem of arguing who is compliant with any governance or best practice; it’s a problem of managing the reality in order to extract analytical information. (This kind of management is common in the BI domain.)
      If you see xAPI key-value pairs as data containers, it’s more important to design a set of containers that capture the data necessary for learning analytics outcomes than to define which vocabulary should be used for which behavior. For example, if verb1, verb2 and verb3 are used for the same behavior, we can map them to the same thing in visualizations and analytics later (that’s easy for a computer). But if one piece of data necessary for some analysis is absent, we can’t do anything about it. (You’ll need to guess, and a lot more human work will be needed.)
      This is real experience sharing. We use the method suggested by Bruno Winck to handle multiple identities. Thanks, Bruno!
      http://kneaver.com/blog/2015/08/how-to-deal-with-multiple-identities-on-xapi/
      Of course, this comment isn’t to encourage people to use different vocabularies, but to share that, in our experience, it’s inevitable.
      – Jessie

  3. I like the generality of xAPI: we can adopt it under very diversified situations and resource constraints, and improve our practices step by step.

    xAPI has had an open spirit since its birth, not only in the standard spec but also in CoP self-governance. The only barrier is knowledge about building effective learning analytics and the available tools, not expensive membership and certification fees.

    Because my major projects serve the education domain, I also just finished an article about xAPI and OER. This is the first time I have been excited that xAPI and learning analytics (LA) could solve education issues around the world. Underprivileged children without any teacher to help them can learn independently with OER.
    In the past, all LA research was limited by data size (data silos) and data types (low-level log data only); that is why LA isn’t yet playing the role it could. Even with xAPI, if there is no semantic interoperability and there are no shared recipes, silos are still silos.

    I just wish xAPI won’t become yet another standard sponsored by businesses, serving businesses only.

    I have confidence in you, Aaron!

    Jessie
