Assessing Digital Literacy: Developing a General Measure


Although it’s not practical to create a single measure for assessing the digital literacy of workers at all levels and in all functional areas, it is possible to create a general measure to assess basic competencies. Here’s our take on developing a framework for creating a digital literacy assessment tool. It includes four components (from concepts to skills and tactics) and three focal areas (communication and collaboration, cybersecurity, and the law and ethics). (Written with John T. Miller II, PhDc)

In January 2015, ATD offered the webinar, Helping Learners Learn in the Digital Era, during which participants asked a number of thought-provoking questions. In addition to inquiring about how they as learning professionals could increase their own digital literacy, they wanted to know how to make a strong business case and sell leadership on the importance of a digitally literate workforce, create institutional support for digital literacy initiatives, and motivate individuals to become more digitally literate.

Underscoring all of these questions were questions about assessing digital literacy, like this one:

Are there particular assessments to get a sense of skill set for a diverse organization – from production line to R&D Engineers to Sales and Marketing – does everyone need the same skill set? (Unedited)

And this related question about ATD’s potential role in developing some kind of tool for assessing digital literacy:

Would appreciate any leadership role ATD can take in expanding the availability of assessments for technology to include both the knowledge and skill levels to go beyond the baseline competencies. (Unedited)

Although it’s not practical to create a single measure to assess the digital literacy of workers at all levels and in all functional areas, it may be possible to create a general measure to assess basic competencies. Here’s our initial take on developing a framework for creating a digital literacy assessment tool. We welcome feedback from others to develop these ideas.

Defining Digital Literacy

Before designing a way to measure digital literacy, it’s important to define it. Both the American Library Association and Wikipedia provide solid working definitions of digital literacy. Here is Wikipedia’s definition:

A digitally literate person will possess a range of digital skills, knowledge of the basic principles of computing devices, skills in using computer networks, an ability to engage in online communities and social networks while adhering to behavioral protocols, be able to find, capture and evaluate information, an understanding of the societal issues raised by digital technologies (such as big data), and possess critical thinking skills.

Although definitions like this are focused on digital literacy in a global sense, the core concepts can easily be applied to the context of work.

Assessing Digital Literacy – Four Components and Three Focal Areas

Workplace digital literacy can be viewed as comprising four hierarchical components. The first three focus on basic knowledge and understanding, as well as organizational and individual applications. The fourth focuses on related skills and the ability to leverage digital technology effectively.

  • Digital era concepts. Focused primarily on job-related communication and collaboration, these include things like platforms, channels, content creation and curation, crowdsourcing, cloud computing, and cybersecurity.
  • Digital tools and systems. Digital tools include the obvious: email, chatting/instant messaging, the Microsoft Office suite of products (and equivalents), as well as tools like photo and video editors. Systems include software applications developed for specific purposes, like accounting, business intelligence, and learning management.
  • Social technology features, platforms, and tools. Social technology features include things like blogging, customized aggregators, dashboards and portals, discussion forums/threads, media sharing, user-generated profiles, and wikis. Platforms and tools include obvious public networks like LinkedIn, Twitter, and YouTube, but also tools like Disqus and ShareThis and more privately oriented offerings like Yammer, Jive, and Interact Intranet.
  • Digital engagement skills and tactics. This component focuses on the skills to use social and digital technologies efficiently, as well as the necessary judgment to use them effectively. Examples include knowing the right channel to use for a given communication, using email productively, creating and engaging properly in discussion threads and forums, content curation and validation, contributing to a wiki, and HTML basics.

In considering how to develop a general workplace measure for assessing digital literacy, we’ve identified three focal areas of primary concern for employers.

  • Communication and collaboration. The ability to communicate and collaborate with others using digital technology is critical in enabling an organization to function both efficiently and effectively. This includes internal communication with colleagues and external communication with clients and others.
  • Cybersecurity. Individuals have generally been found to be the weakest link in protecting an organization in cyberspace. Understanding the risks and engaging in the right behaviors creates a strong first line of defense against hackers, viruses, and other digital threats.
  • The law and ethics. Workers now have added responsibilities with respect to things like protecting an organization’s brand, intellectual property and trade secrets; maintaining proper levels of confidentiality; and ensuring the privacy of clients, fellow employees, and other stakeholders. These responsibilities apply not just to their digital engagements while on the job, but can also extend to their non-work activities.

Building a Measure for Assessing Digital Literacy

Any tool for assessing digital literacy should include questions that address the relevant components in the context of each of the three focal areas. Here are some examples.

Communication and collaboration

  • Digital Era concepts: What is an enterprise social network? What is crowdsourcing?
  • Digital tools and systems: What is the main difference between texting and chatting/instant messaging?
  • Social technology features, platforms, and tools: What are the basic elements of a user profile?
  • Digital engagement skills and tactics:
    • You’d like to start a dialogue with some colleagues on a specific topic. What is the best digital means for creating and conducting that conversation?
    • What are the basic steps for adding a hyperlink to text or an image in a Microsoft Word, PowerPoint, or Excel file?
    • Is it okay to express strong negative emotions via digital channels?

Cybersecurity

  • Digital Era concepts: What is phishing? What is hacking? How is malware different from a virus?
  • Digital tools and systems: What is the primary way an individual can expose an organization to malware or a virus?
  • Digital engagement skills and tactics:
    • If you suspect your computer has been infected by a virus or malware, what’s the first thing you should do?
    • What’s a good password to use on your mobile device?
    • If you access work systems remotely, should you make sure you’re on a secure channel first?

The law and ethics

  • Digital tools and systems: How can trade secrets be leaked via email?
  • Digital engagement skills and tactics:
    • If you’re discussing the organization or one of its brands on a public social network (for example, LinkedIn), should you disclose your working relationship?
    • Is it appropriate to discuss clients on your personal social networks (for example, Facebook), even if you don’t name names?
    • If you suspect a confidentiality leak, what is the first step you should take to report the loss or compromised information?
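One practical way to work with the structure above is to tag each assessment item with its component and focal area, then check which pairings in the matrix still lack questions. The sketch below is a minimal illustration in Python; the sample items are drawn from the article’s examples, but the data structure and function names are assumptions, not part of any existing tool. Note that the article itself does not populate every pairing (for example, no social-technology question appears under cybersecurity), so some gaps may be intentional.

```python
from itertools import product

# Taxonomy from the article: four components and three focal areas.
COMPONENTS = [
    "Digital era concepts",
    "Digital tools and systems",
    "Social technology features, platforms, and tools",
    "Digital engagement skills and tactics",
]
FOCAL_AREAS = [
    "Communication and collaboration",
    "Cybersecurity",
    "The law and ethics",
]

# Each item is tagged (component, focal_area, question); two illustrative items.
items = [
    ("Digital era concepts", "Cybersecurity", "What is phishing?"),
    ("Digital engagement skills and tactics", "The law and ethics",
     "Should you disclose your working relationship when discussing a brand?"),
]

def coverage_gaps(items):
    """Return the (component, focal_area) pairs that have no items yet."""
    covered = {(c, f) for c, f, _ in items}
    return [pair for pair in product(COMPONENTS, FOCAL_AREAS)
            if pair not in covered]
```

With the two sample items above, 10 of the 12 possible pairings remain uncovered, which tells a test designer exactly where to write questions next.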

Design considerations implied by the above examples include:

  • Questions can be true/false, multiple-choice, and open-ended.
  • The assessment can include specific skill exercises, such as sharing a link to a web page or starting a discussion thread.
  • The assessment should probably be weighted toward skills and tactics, including assessments of judgment, etiquette, and ethics (i.e., what’s the right thing to do in a particular situation).
  • To the extent possible, questions should be contextually independent, although different versions of the same question may need to be created to adapt to different browsers (Internet Explorer or Chrome) and operating environments (Windows or Mac). Some answers may need to be tweaked to reflect an organization’s unique policies, procedures, and practices.
  • Questions should have clear right/wrong answers to enable the creation of strong cutoffs between different competency levels, as well as to establish baseline measures and standards.
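The weighting and cutoff considerations above can be illustrated with a small scoring sketch. The specific weights, cutoff thresholds, and level labels below are invented for demonstration; an actual tool would set them through the validity and reliability testing discussed later.

```python
# Hypothetical weights favoring skills and tactics, per the design note above.
WEIGHTS = {"knowledge": 1.0, "skills_and_tactics": 2.0}

# Hypothetical cutoffs (fraction of weighted points) separating levels,
# checked from highest to lowest.
CUTOFFS = [(0.85, "advanced"), (0.60, "proficient"), (0.0, "basic")]

def score(responses):
    """responses: list of (question_type, answered_correctly).
    Returns (weighted fraction correct, competency level)."""
    earned = sum(WEIGHTS[qtype] for qtype, ok in responses if ok)
    possible = sum(WEIGHTS[qtype] for qtype, _ in responses)
    frac = earned / possible if possible else 0.0
    level = next(label for cutoff, label in CUTOFFS if frac >= cutoff)
    return frac, level

frac, level = score([
    ("knowledge", True),
    ("knowledge", False),
    ("skills_and_tactics", True),
    ("skills_and_tactics", True),
])
# earned = 1 + 2 + 2 = 5 of a possible 6, landing in the "proficient" band
```

Because skills-and-tactics items carry double weight here, missing a knowledge question costs less than missing a judgment question, which is one simple way to operationalize the skew the design calls for.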

In addition, to motivate people and make the test more palatable, the assessment could take the form of a digital scavenger hunt, with gamification elements to reward people for correct answers and accomplishments.
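The scavenger-hunt idea could be prototyped as a simple points-and-badges tracker. The point value, badge names, and thresholds below are illustrative assumptions only.

```python
# Hypothetical gamification: fixed points per correct answer,
# badges awarded at point milestones.
POINTS_PER_CORRECT = 10
BADGES = {30: "Digital Explorer", 60: "Digital Navigator", 100: "Digitally Literate"}

class ScavengerHunt:
    def __init__(self):
        self.points = 0
        self.badges = []

    def record(self, correct):
        """Award points for a correct answer and any newly reached badges."""
        if correct:
            self.points += POINTS_PER_CORRECT
        for threshold, name in sorted(BADGES.items()):
            if self.points >= threshold and name not in self.badges:
                self.badges.append(name)

hunt = ScavengerHunt()
for outcome in [True, True, True, False, True]:
    hunt.record(outcome)
# Four correct answers earn 40 points and the first badge
```

Even a sketch this small shows the motivational hook: progress is visible after every question, and wrong answers cost nothing, which keeps the assessment feeling like a hunt rather than a test.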

Your Thoughts?

Creating a tool for assessing digital literacy is certainly possible, but it will not be easy. There are a lot of nuances associated with the measurement components described herein, and lots of details to be addressed, not to mention all the testing necessary to ensure the tool is both valid and reliable.

But let’s stick with design for the time being. Does this first cut at creating a general digital literacy assessment make sense to you? What other components should be included? What other focal areas should be considered? How would you tweak the design?

We welcome your feedback and ideas.

A version of this article was previously published on the ATD Career Development Blog.
