What do you think of when you hear the word ‘literacy’? Reading, writing, and arithmetic? These three established literacies are regarded as an integral part of education curricula around the world. They’re also considered essential skills that help children progress through education and master life skills. But what about digital literacy?
What is digital literacy?
The notion of digital literacy has actually been around for quite some time. Traditional definitions tend to be more hardware-centric, emphasising the mastery of IT equipment. However, back in 1997, Paul Gilster noted that digital literacy is a special kind of mindset “about mastering ideas – not keystrokes.”
Modern definitions have taken this idea into consideration, defining digital literacy as:
- The skills and knowledge to access and use a variety of hardware devices and software applications;
- The ability to understand and critically analyse digital content and applications;
- The ability to create with digital technology (Media Awareness Network, 2010).
This definition goes beyond skill-based competencies, emphasising the importance of critical thinking and problem-solving. Given that most modern workplaces are technology-rich environments, digital literacy ultimately prepares students to apply the critical thinking and problem-solving skills they develop at school in a new context.
Why does digital literacy matter?
We live in an age where technology is ubiquitous, and most modern jobs involve interacting with some form of it. Digital literacy is therefore no longer just about becoming more employable; it’s about equipping students to navigate the demands of life and reach their potential as individuals.
Researchers Katz and Macklin (2007) suggested that employers often take digital literacy for granted, which puts students and adults with weaker digital skills at an immediate disadvantage when applying for jobs or securing university places.