Thursday, May 2, 2013

Lessons Learned

While the development of the Nature Explorer is coming to an end and I am getting ready for the final project presentation, it seems worthwhile to look back and identify what this experience has taught me. Starting from the very first post at the beginning of February about ILMD definitions to the very last one concerning the collaborative effort of designing educational games, I will try to identify the lessons I gained and their applicability to other design situations.


ILMD understanding

My idea of interactive multimedia for learning has not changed much since the beginning of the project. Perhaps my extensive experience developing educational software had already given me a pretty good idea about interactivity, multimedia, and learning and their interplay. Theories about interaction design from the human-computer interaction side have also helped me a lot to comprehend the design and development process, especially the importance of iteration and rapid prototyping. Nature Explorer was just another case where these tools allowed me to be productive right away and get feedback from peers and from testing at the early stages of design, informing my redesigned prototypes.


Problem or technology first?

In this next phase I came to realize the importance of problem identification as a priority in instructional technology. Coming from a technology background, I was inclined to try to find useful applications for existing and/or novel technologies. Thus, it came naturally to me to think in terms of cool technologies that could be exploited for learning purposes; the Kinect was one of the first things that came to mind. Although such technologies play an important motivational role in instructional design, they can very easily lead someone astray during the crucial problem identification phase. Through the process we followed, I learned to focus on identifying actual instructional challenges and opportunities for applying computer-assisted instruction. I do recognize, though, that depending on the domain you are working in, you may need to wear different hats (MAE's favorite expression) to address different problems.

We have a problem, and now what...?

My interest in informal learning led me, one way or another, to the Price House Nature Center, a volunteer-run museum which was "waiting" for my assistance. The unique needs of the space imposed some design guidelines that came to mind right after my first meeting with the Associate Director: the system should be extremely easy to use, require no instructions, and be durable enough to withstand kids' abuse! Normally I would have started right away to test my ideas (which I almost did), first on paper and then with actual prototypes. The requirement to do some precedent study and to actually align the client's needs with some solid learning objectives was extremely helpful in providing a foundation on which I could build my instructional approach. Overall, the application evolved into an instructional tool for informal learning instead of a fun scavenger hunt game with some potential sparks of learning!

Are all the phases relevant?

The four phases of instruction, although implicitly integrated in most of my previous educational software endeavors, were useful in forcing me to think in terms of specific learning goals to be met. Although not all of them seemed pertinent to the informal type of learning in the Nature Explorer (it was hard to see the role of practice and assessment at the beginning), they started to fall into place as soon as I began working in depth on the desired learning outcomes. Those four phases provide a framework for thinking about the objectives of any type of educational software. The role of motivation was equally important in leveraging the design elements that children find engaging (e.g., role play, physical motion, social play, free exploration) to produce a fun learning experience. This was even more imperative in the context of this space, where curiosity, challenge, and fantasy can be fostered in a way that promotes discovery learning... and this is what NE is trying to achieve.


(Play)testing makes you better!

The best way to get valuable feedback in interaction design is to test with actual users, and that is what I did in the next stage of the project. This is even more the case when developing learning software, where testing plays a double role: identifying usability problems, but also assessing the learning effectiveness of the tool under development. Since these two aspects are intertwined, it is hard to evaluate the latter disconnected from the former; i.e., problems in using the software prevent users from using it effectively and thus from achieving the desired learning outcomes. Playtesting allowed me to identify some serious design flaws (e.g., how to scan a code, or how to find one in the museum space) that I could not have predicted and that would, most probably, have hindered the learning process. If time had allowed, I would have preferred to playtest a second, more polished prototype and give emphasis to learning assessment this time.


Usability remedy

The playtesting eventually paid off by revealing the weaknesses of the system, mainly in supporting children in their learning process. As a designer enacting multiple roles you are bound to make assumptions about how the tool is going to be used; this is even more the case if you don't have the necessary support from other experts (for more details see my previous post). The usability problems that were identified helped me understand where users needed elaborate explanations and/or some scaffolding to build a mental model of the system and follow its progress smoothly. Additionally, feedback from instructors and peers surfaced other potential issues, which I decided to address to prevent future interaction glitches.


Transferring knowledge

One of the most important aspects of every learning experience is transferring the acquired knowledge to other domains and experiences. Firstly, the most significant gain was the unique experience of working for such an informal space, where there were seemingly few guidelines on what needed to be achieved. Learning goals were quite ill-defined and I had to create the instructional objectives based on my understanding of the space's needs. This was an interesting design challenge, one that applies widely to other domains where clients think they know what they want but you still have to elicit their actual needs.

Secondly, due to the space's irregular visiting schedule and the sensitivity of the audience's age, it was not feasible to observe and discuss "users' needs" in a meaningful way. User study techniques like contextual inquiry normally require you to be temporarily integrated into the environment in order to understand the current work practice, but there are instances where this is not feasible (e.g., working with private and secure data) and you have to find workarounds. Working with such a young target group gave me the opportunity to devise alternative methods to understand their needs (e.g., the playtesting session).

Finally, the lack of available (domain and content) experts, or their inability to respond promptly to my requests, forced me to take on some of their roles in order to keep development moving. This is often the case when designing in the real world, where clients might not always appreciate the importance of such experts and the designer has to bear some of this burden.

Overall, working on such an open-ended project, with loosely defined needs, "inarticulate" users, and non-existent instructional objectives, was a great opportunity for me to test my instructional and interaction design abilities. I had to impose some design guidelines and constraints based on my understanding of the problem and test my assumptions through an iterative, rapid prototyping process. The challenge might have been unique to this situation, but the resulting process will be an invaluable tool for future design experiences.

1 comment:

  1. We'll be giving more feedback, of course, but I just wanted to comment that you provided some good insight into your process and points of growth. In all honesty, I hope this open-ended and ill-defined experience was a push in the right direction for you in order to understand design and development of products for this very type of user (perhaps one of the most difficult). As a former teacher and "professional" in instructional technology, I am particularly excited that you see the importance of identifying instructional objectives BEFORE you begin thinking of the technology. I see this mistake made so often, and the result is usually not favorable. Thank you for including that in your reflection. These are all exciting lessons to read about! Please continue to be a reflective developer, as it appears, on the outside anyway, to be something very valuable in your practice.
