3DWorkSpace (b)log II – evaluation is key

Hugo Huurdeman, Jill Hilditch, Jitte Waagen

Why evaluation matters

From the outset, the 3DWorkSpace project has focused on the evaluation of the platform. Why? As a multi-user environment that should ultimately benefit education and research by providing tools for learning about 3D datasets, it is of paramount importance to understand how well our platform actually performs in this respect. This information is essential for checking usability, fine-tuning functionality, discussing the platform's potential for implementation in education and research and, of course, eventually for assessing our efforts and investments and determining future directions.

From the start, we envisioned that we would like to find answers to various questions, such as:

  • To what extent does the platform support easy interaction with 3D datasets?
  • To what extent does it encourage more in-depth engagement with 3D datasets?
  • To what extent is 3DWorkSpace a viable platform for education or research?
  • What are regarded as the most useful features of the platform?
  • What potentially useful features are not yet incorporated?

To this end, we designed an evaluation strategy that includes a full range of potential users, from researchers to teachers to students, as well as a selected group of experts in the domain of ICT and education, digital archaeology and also the 3D Program team of the Smithsonian itself – the creators of the Voyager 3D toolset integrated into 3DWorkSpace.

Fig 1. 3DWorkSpace models page

Evaluation design: a plan in two parts

In order to collect useful evaluation data, we designed both quantitative and qualitative evaluations.

The first part was an expert evaluation, planned rather early in the development process: after a first concept of the 3DWS platform was online, but before the final phase of development, bug-fixing and fine-tuning. We invited a combination of hand-picked and suggested reviewers, selected based on their respective expertise within and outside archaeology and their different specializations. We asked our reviewers to follow a structured online survey (using LimeSurvey) that included a statement of the project aims and a set of dedicated screencasts running them through the 3DWS platform in its state of development at that time. After a round of demographic questions, the participants engaged with the platform themselves. Subsequently, they filled out additional questionnaires about their experience of using 3DWorkSpace. These included the System Usability Scale (SUS), a validated usability survey (Brooke, 1996), as well as a set of more qualitatively oriented questions.

Six experts participated in the usability survey, meeting the generally accepted minimum number of participants needed to generate a quantitative SUS score (see e.g., Virzi, 1992). The subsequent qualitative questions were aimed at generating expert opinions on whether the 3DWS platform was fit for purpose, as well as at gathering feedback on implementation possibilities and detailed information on potential shortcomings. In this way, we hoped to gather useful information from a group with a broad perspective and deep knowledge of 3D, digital methods and heritage.

The second part of the evaluation was a focus group with students, one of the intended end-user audiences. This was organized in a later phase of the project, using a more refined version of the 3DWS platform that contained elaborate examples of learning pathways (specific, sequential guided activities aimed at achieving competence) using the 3D model collections within the platform. Participants were students who responded to an advertisement for participation. As the results of the evaluation's introductory questionnaire indicate, these students had varied experience with 3D datasets in the course of their studies. During the focus group session, the students were introduced to the project and the platform, watched screencasts of 3DWorkSpace, and were then presented with a case study on forming traces in pottery production in antiquity. Finally, they completed usability questionnaires similar to those given to the experts. More importantly, we held a plenary discussion on the 3DWS platform to get their perspective. The nature and role of the learning pathways presented to the students are the focus of Blogpost 3 of the 3DWS project.

Fig 2. Screenshot of the Voyager app

Results of the expert assessment

The expert participants in our survey formed a diverse group in terms of their fields of study (Earth Sciences, Archaeology, Heritage Management, Computer Science, Information Science and Engineering), but most participants had obtained postgraduate qualifications in their chosen fields and, ultimately, were working in computer science/visualisation applications or archaeological research. After trying out the 3DWorkSpace toolset, the participants first filled out the SUS, which consists of 10 questions (for instance: "I think I would like to use this system frequently" and "I found the system unnecessarily complex"). From the answers, given on a scale of 1 to 5, the final System Usability Scale score is calculated. This score ranges from 0 to 100, where a minimum score of 52 is considered "OK" and a score above 70 is considered "Good" (see e.g. the empirical evaluation by Bangor et al., 2008).
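
The SUS scoring rule just described can be sketched in a few lines. This is a minimal illustration of Brooke's standard formula, not part of the 3DWorkSpace codebase:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (Brooke, 1996).

    `responses` holds the 10 answers, each on a 1-5 scale, in
    questionnaire order: odd-numbered items are positively worded,
    even-numbered items negatively worded.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    raw = 0
    for i, answer in enumerate(responses, start=1):
        # Positive items contribute (answer - 1), negative items (5 - answer),
        # so each item yields 0-4 points.
        raw += (answer - 1) if i % 2 == 1 else (5 - answer)
    return raw * 2.5  # scale the 0-40 raw sum to the 0-100 SUS range


# A neutral response (all 3s) lands exactly mid-scale:
print(sus_score([3] * 10))  # → 50.0
```

Note that the scale rewards both agreeing with the positive items and disagreeing with the negative ones, which is why the raw item scores are folded before scaling.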

The System Usability Scale score for the 3DWorkSpace platform was 74.64, based on the questionnaires completed by our group of six experts. Thus, the usability of the platform may be considered good, even though individual questionnaire items indicated further potential for improvement.

This was further explored in the next part of the questionnaire, with specific open questions about the usability of 3DWorkSpace. Participants indicated that their overall experience with 3DWorkSpace was "positive", that the system was "easy to navigate" and that it "made the artifacts and models tangible". Specific features, such as the learning pathways, annotation features and possibilities for collection making, were generally seen as the most useful aspects of 3DWorkSpace. The 'live' 3D model views were also seen as a useful addition, although several participants reported that they slowed down their browser, due to the hardware resources needed to show multiple 3D viewer panels simultaneously. Some concerns were raised regarding the workflows (for instance, when adding learning pathways) and the ability to change other people's collections in the current prototype. A number of concrete suggestions for improving features were provided, for instance adding more visual elements to the currently largely textual learning pathways. Additionally, feedback for improving the user interface and user experience was given, pointing at the naming of functionality, the contents of menus and the organization of features. These suggestions provide a wealth of useful feedback for future improvements of 3DWorkSpace.

The next part of the survey looked at the purposes of the 3DWorkSpace project and the ability of the created tools to meet the project's goals: develop an online platform for interacting with 3D datasets and explore its potential to offer structured guidance, stimulate discussion and advance knowledge publication. Generally, the experts deemed the tool appropriate for reaching the project's goals. One participant mentioned that it "certainly makes it easier to engage with 3D datasets through the viewer and the rich annotation and documentation system". Another referred to the possibility of allowing "multiple people to create their own annotations and interpretations of the same datasets" as a crucial element. This was underlined by another participant: the tool facilitates "co-creation of and transfer of knowledge", in both didactic and science dissemination contexts.

Specific observations made by the experts on the placement and visibility of models, information texts, additional hyperlinked content and more were useful for considering how to maximize engagement with the integrated datasets. One expert asked whether gamification of the learning pathways (questions and scoring) might encourage wider or more in-depth engagement with the 3D models in educational and heritage-based contexts. Another comment bridging usability and engagement potential suggested including ‘info-tips’ to briefly show the functions and capacities of the tools on offer, or prompts to remind users of the different ways they could interact with the models and collections. Further, it was emphasized that the structured guidance contained in the learning pathways needs to be tested more systematically in a pedagogical context. A first step in this regard will be described in Blogpost 3, which focuses on the learning pathways in 3DWorkSpace and their evaluation.

Many comments looked ahead to scaling up the 3DWS prototype and raised concerns regarding data integrity and data visibility for different user groups. Maintaining the integrity of uploaded collections with curated annotations and navigation was a key issue for considering reuse of the models and collections, as well as publication rights. Developing different user profiles with greater or lesser editing rights, and restricting access to collections containing unpublished 3D models to authenticated collaborators only, were also suggested as future avenues for safeguarding integrity on the platform.

Overall, the experts commented on the broad interdisciplinary appeal of the 3DWS platform: sharing and inspecting 3D datasets is important for moving knowledge and collaboration forward in many fields (such as medicine and the geosciences, among others). The commenting feature and the ability to add notes to 3D models were also found to open up important space for new dialogues and knowledge sharing, giving such a platform future appeal across a broad range of contexts.

Fig 3. 3DWorkSpace - Learning Pathway

Conclusion and discussion

This blogpost outlined one of the key focal points of the 3DWorkSpace project: evaluation. An expert evaluation of the platform resulted in a SUS usability score of 74.64, which represents “good” usability. In addition, the qualitative parts of this study showed many positive aspects, for instance the ease of navigation and the platform’s facilitation of the co-creation of knowledge. Naturally, potential points for improvement were also identified, for example regarding editing workflows and technical aspects of the platform.

In the next blogpost, we will shift our focus to the evaluation conducted with students and discuss the nature and role of the learning pathways introduced in 3DWorkSpace.

References

Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the System Usability Scale. International Journal of Human-Computer Interaction, 24(6), 1–44. https://doi.org/10.1080/10447310802205776

Brooke, J. (1996). SUS: A “quick and dirty” usability scale. In P. W. Jordan, B. Thomas, B. A. Weerdmeester, & A. L. McClelland (Eds.), Usability Evaluation in Industry. Taylor and Francis.

Virzi, R. A. (1992). Refining the test phase of usability evaluation: How many subjects is enough? Human Factors, 34(4), 457–468. https://doi.org/10.1177/001872089203400407

3DWorkSpace (b)log I – from idea to platform

Jitte Waagen, Hugo Huurdeman, Jill Hilditch

Introduction: 3D datasets and open science

In the field of material heritage, we see an exponential increase in 3D datasets, which provides the scientific community with interesting new possibilities. One of these possibilities is to share 3D models on online platforms, making use of so-called ‘3D viewers’. Such platforms can add real value to 3D datasets, because they allow scientific data to be presented in real-world dimensions, provide the possibility of annotating the models, and often feature tools to interact with the models. All these factors increase the impact of 3D datasets by making them insightful and by creating a versatile medium for communicating in-depth knowledge about those datasets. Some useful platforms have already been developed specifically for archaeological collections, such as Dynamic Collections, a 3D web infrastructure created by DARK Lab, Lund University (https://models.darklab.lu.se/dynmcoll/Dynamic_Collections/), or PURE3D (https://pure3d.eu/), which focuses on the publication and preservation of 3D scholarship.

However, the availability of online 3D datasets on such platforms also presents challenges: 3D datasets can be complex to understand and interact with, and presentation tools often lack possibilities for bi-directional knowledge transfer, which means that insights and narratives generated from the user perspective are difficult to integrate and often ignored. From an Open Science perspective, it would be very interesting if such platforms provided novel tools to create a rich multi-user workspace: for instance, creating your own versions of 3D models and personal annotated collections, multi-author 3D models or collections, tools to enable discussions on those models and collections, and 3D-illustrated learning pathways.

The project that we introduce here, 3DWorkSpace, is an Open Science project funded by NWO (https://www.nwo.nl/projecten/203001026, see also the project announcement at https://4dresearchlab.nl/3dworkspace-project-announcement/) and led by Jill Hilditch, Jitte Waagen, Tijm Lanjouw and Hugo Huurdeman. The goal of the project is to develop an online platform for interacting with 3D datasets and to explore its potential to offer structured guidance, stimulate discussion and advance knowledge publication. The project is not so much aimed at creating yet another platform; rather, it is intended as a pilot study that combines realizing a platform with case studies exploring its potential, benefits and shortcomings. These case studies focus on deployment in the classroom as well as on peer interaction in research and professional contexts.

Voyager

The 3D viewer technology is quite different from the eventual user-facing platform into which you would like to integrate it. Depending on the case, building a viewer from scratch might not be a good idea, especially when many good examples already exist. Since our goal was to explore the potential for creating the platform and to evaluate that, we decided to work with an existing viewer. In our explorations, both within the 4D Research Lab and in the Tracing The Potter's Wheel project (https://tracingthewheel.eu/), we evaluated various 3D viewers and 3D web technologies, such as 3DHOP (https://3dhop.net/), Aton (https://osiris.itabc.cnr.it/aton/), and Potree (https://potree.github.io/). Each of these has its specific benefits and drawbacks in terms of features, usability and technology, but we eventually chose Smithsonian Voyager (https://smithsonian.github.io/dpo-voyager/), an open-source 3D toolset. We found Voyager's focus on providing both a web-based 3D model viewer (Voyager Explorer) and an extensive authoring tool (Voyager Story) especially attractive. The authoring tool allows a user without specific technical experience to enrich 3D models via a web browser, for instance by adding annotations and articles and combining these into tours. These enriched 3D models can subsequently be published by integrating Voyager Explorer into a website, although this requires some technical expertise. Given this capacity, Voyager ticked quite a few boxes on our wishlist. A final important benefit of Voyager is that the professionals behind its development are working hard to bring their product to as many users as possible and to increase its flexibility. Direct communication with the 3D Program team of the Smithsonian's Digitization Program Office has been of fundamental value to the 3DWorkSpace project.

Having decided to use Voyager as the 3D-viewing building block of our platform, we turned to designing a platform in which it could be integrated, allowing us to reach our goals related to the open science approach of multi-authoring, learning and discussing. The challenge was not to fall into the trap of ‘featureism’, i.e. thinking up as many cool features as possible to integrate into the single most fantastic tool; that approach can lead to usability problems and implementation difficulties. Instead, we opted for a theoretical and methodological discussion, which led to a baseline set of features that would facilitate the type of use and case studies we were working towards. Thus, in addition to the basic browsing and search functions of such a platform, users should be able to:

  • create and use their own 3DWorkSpace account (user authentication)
  • compare 3D models side-by-side using multiple viewer panels (comparing models)
  • annotate specific 3D models (annotating models)
  • create and save personal or public collections of 3D models (collection making)
  • add basic metadata to collections (describing collections)
  • add comments to collections and reply to comments (discussing collections)
  • create learning pathways for collections, incorporating textual content and hyperlinks to custom views of specific models (creating collection learning content)

 

Components of the 3DWorkSpace platform

The final 3DWorkSpace platform integrates the basic features we defined, such as collection making, annotation of 3D models and detailed discussions about collections. These features were implemented using three main elements: a storage server for 3D assets, the 3D viewer and authoring tools, and the 3DWorkSpace system itself.

The first crucial element of the 3DWorkSpace platform entailed the storage and retrieval of the required 3D assets. These assets include 3D models, but also related annotations and additional metadata about the models. As we aimed to create a bi-directional platform, these files had to be not just statically stored, but also dynamically editable. Voyager directly supports the WebDAV protocol (https://en.wikipedia.org/wiki/WebDAV), which allows files to be edited directly on a web server. This WebDAV server therefore provided the foundation of the 3DWorkSpace platform.
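
To give a flavour of why WebDAV suits this bi-directional setup: unlike plain HTTP hosting, WebDAV lets a client write files back to the server with a simple PUT request. The sketch below is illustrative only (hypothetical host and path, Python standard library; this is not the actual 3DWorkSpace code, which lets Voyager handle WebDAV itself):

```python
import http.client

def webdav_put(host, path, data, port=80):
    """Upload (or overwrite) a file on a WebDAV-enabled server via HTTP PUT.

    Returns the HTTP status code: typically 201 (Created) for a new file,
    or 204 (No Content) when an existing file was overwritten.
    """
    conn = http.client.HTTPConnection(host, port)
    conn.request("PUT", path, body=data,
                 headers={"Content-Type": "application/json"})
    status = conn.getresponse().status
    conn.close()
    return status

# Hypothetical usage: push an edited Voyager scene file back to storage.
# webdav_put("storage.example.org", "/assets/pot.svx.json", b'{"scene": {}}')
```

Because reads and writes go through the same server, enrichments made in the authoring tool become immediately available to every viewer instance.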

Second, we integrated the Voyager toolset into the 3DWorkSpace platform. Specifically, we made use of two elements of the toolset: Voyager Explorer, the web-based viewer for 3D models, and Voyager Story, the authoring tool for creating the necessary files to display 3D models together with contextual information in Voyager Explorer (using Voyager’s structured SVX-format). Enrichments created using Voyager Story were automatically saved on the previously described storage server and could be visualized using the Explorer element.

Finally, the third crucial element was the 3DWorkSpace system itself, which seamlessly integrates the Voyager tools. The system uses the Firebase app development platform (https://firebase.google.com/) for features such as user authentication and the associated databases. The user interface (‘front-end’) was created with the React framework (https://react.dev/), which builds interfaces from individual pieces (called ‘components’). An advantage of React is that the components created are highly adaptable, resilient and reusable, further contributing to the goals of the Open Science program that 3DWorkSpace is part of.

3DWorkSpace overview 

The three components discussed above led to the platform illustrated in Figures 1, 2 and 3. Users can browse and search models, collections and learning pathways. A unique feature of 3DWorkSpace is that users can always interact directly with the 3D models: in search result lists, in collections, and in detail views (Figure 1). Displaying multiple models makes it possible to compare model features directly, which is potentially useful for educational, research and professional purposes.

Within the detailed views of collections (Figure 2), users can view and interact with the associated 3D models, for instance by rotating them or by toggling visible annotations. Logged-in users can also edit 3D models and metadata using Voyager Story. These edits are saved directly to the WebDAV storage server, offering a seamless experience.

On the right-hand side of a collection, various tabs allow for inspecting and editing collection metadata, notes and comments, as well as learning pathways. These features allow for unique possibilities in terms of bi-directional knowledge transfer: for instance, discussions with peers or teachers. Furthermore, learning pathways (Figure 3) allow for directly linking learning content with specific model views, such as close-ups of forming traces on ceramics. In this case, the multi-model view also allows for direct comparisons. Learning pathways will be further discussed in Blogpost 3.

Challenges, solutions and future work

While the 3DWorkSpace platform prototype provides various novel features, a number of challenges arose during its design and implementation, including user roles, potential system requirements and the authoring of enriched 3D models.

User authentication is an important issue. In the prototype version of the 3DWorkSpace platform, users can register and log in to access commenting and editing features. However, there is no differentiation between roles: any user can directly edit or even delete any model, collection or associated data. A future version of the platform should distinguish different user roles, such as administrators (full editing access), editors (able to upload and edit models or collections) and commenters (only able to comment on collections). This is especially important for use of the platform in educational settings.

The unique feature of displaying multiple editable models on search result pages facilitates model comparison, but also led to high memory usage, a potential issue for users with older or limited computers. We addressed this by including only six models on any given page (e.g., in a search result list). In future work, model display via Voyager can be further optimized, for instance by initially showing low-polygon versions of models, or by showing thumbnails that only load the full model when clicked.
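
The six-model cap amounts to simple pagination over the result list. A minimal sketch, where only the per-page limit of six is taken from the text and the function name is our own:

```python
def page_of_models(models, page, per_page=6):
    """Return the models to render on a 1-indexed results page.

    Capping each page at six models bounds the number of 3D viewer
    panels that are alive at once, and thus the memory/GPU load.
    """
    if page < 1:
        raise ValueError("pages are 1-indexed")
    start = (page - 1) * per_page
    return models[start:start + per_page]


models = [f"model-{i}" for i in range(14)]
print(page_of_models(models, 1))  # first six models
print(page_of_models(models, 3))  # → ['model-12', 'model-13']
```

The browser then only ever instantiates at most six viewer components, regardless of how many models the query matched.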

A final challenge was the inherent complexity of Voyager Story, the authoring tool for enriching 3D models with metadata, annotations and tours. Voyager Story has many in-depth features that make it an incredibly useful tool, but its complexity creates some difficulties for first-time users. Resolving this was not feasible within the scope of 3DWorkSpace, but we were able to alleviate it by creating extensive screencasts explaining the authoring process.

Conclusion

We hope this blogpost has provided you with some insight into our ideas and how they steered the development of 3DWorkSpace. We will cover the platform evaluation and practical case studies in the next few blogposts!

 

Team

Project Lead

  • Jill Hilditch - ACASA
  • Jitte Waagen - ACASA / 4D Research Lab

Concept, development, evaluation

  • Hugo Huurdeman - Open Universiteit
  • Tijm Lanjouw - 4D Research Lab
  • Caroline Campolo-Jeffra - Immersive Heritage

Technical development

  • Markus Stoffer - 4D Research Lab
  • Ivan Kisjes - CREATE
  • Saan Rashid - CREATE

Funded by NWO Open Science Fund (203.001.026)

 

 

 

Fig 1. 3DWorkSpace models page
Fig 2. 3DWorkSpace side-by-side comparison on a collection page
Fig 3. 3DWorkSpace - Learning Pathway