Abstract
Daily life is indelibly marked by the media, which create (or help to create) different forms of leisure, entertainment and work. Reading, analysing, decoding, understanding and interpreting the media and the content they convey, as well as creating and producing messages, are today fundamental competences required to deal with and manage the flows of communication and information that reach us through various media and platforms. The following study involved administering an online questionnaire to a sample of 679 Portuguese students, mostly between the ages of 17 and 18. The aim was to characterize their media access and use in their final year of compulsory education, to determine their knowledge of the media, as well as their analysis, interpretation and production competences, which were placed on a three-level scale of media literacy. Findings suggest that knowledge is at a basic level, in most cases at the lowest level of the scale.
Introduction
Literacy, a term coined in the 19th century ‘to express the achievement and possession of what were increasingly seen as general and necessary skills’ (Williams, 1983: 188), was initially associated with the basic skills of reading, writing and arithmetic. These dimensions prevailed for most of the 20th century and provided the structure for the first Portuguese national literacy study, which defined it as ‘the ability to use the skills (taught and learned) of reading, writing and arithmetic’ (Benavente et al., 1996: 4). However, as society became more complex, the concept itself evolved beyond the understanding of ‘words or numbers strung together’ (Moeller et al., 2011: 10). The wealth of information and the proliferation of forms of communication brought about new challenges which society had (and has) to respond to, resulting in a need for new literacies, including media literacy (Livingstone et al., 2008; Pinto et al., 2011; Potter, 2010). Although Potter (2010) states that there is no single concept of media literacy, it does seem to comprise three essential dimensions: access and use; a critical understanding of the multiple facets of the media; and the production of content and participation in and through the communication media. It is in this sense that the European Commission Recommendation (Recommendation 2009/625/EC) defines media literacy: ‘the ability to access the media, to understand and critically evaluate different aspects of the media and media content and to create communications in a variety of contexts’. This was the concept which Pereira et al. (2015b) drew on in their study to ascertain the media literacy competence levels of a national sample of young people, mostly between the ages of 17 and 18, attending Portuguese public schools. This article examines some of the findings of the study, focusing on the data on the group’s media competences.
It also seeks to discuss the method and the importance of assessing media literacy knowledge and competences, highlighting both the merits and the problems of the scale that was used.
The concept of competence and media literacy competence
Despite the popularity of the concept of competence, it does stir up controversy. Ferrés Prats (2007) and Guzmán Marín (2012) trace the term to the business world and labour market, from which it was later adopted by academia and education policymakers. When discussing competence, one is therefore not referring to a stable and widely accepted concept. The proliferation of terms frequently treated as synonymous (capacity, knowledge, ability, etc.) and the diverse fields in which it is employed make it difficult to define (Drexel, 2003; Guzmán Marín, 2012; Lopes et al., 2015; Perrenoud, 1995; Sultana, 2009).
For Perrenoud (1995), competences are more than know-how or performance: they are ‘a problem-solving strategy relying on reasoning, inferences, foresights, assessing the probability of different events, reaching a diagnosis based on a set of indicators, etc’ (p. 21).
According to the author, the development of competences allows ‘different types of knowledge to be related with each other and operationalised in complex situations’ (Perrenoud, 1999: 16) and to be employed in different contexts.
When it comes to defining media literacy competence, Ferrés Prats (2007) and Fastrez (2010) make an important contribution. For the former (Ferrés Prats, 2007: 103), a competence relates to the concepts, procedures and attitudes connected with six key dimensions (language, technology, production and programming processes, ideology and values, reception and audience, aesthetics). For the latter (Fastrez, 2010), there are four tasks – reading, writing, navigation and organization – and three dimensions – informational, technical and social. The articulation of the tasks with the dimensions generates ‘a twelve-cell matrix establishing a set of media literacy competences’ (Fastrez, 2010: 36). Despite not joining the debate on what constitutes competences, Fastrez (2010: 36) associates them with the actual definition of media literacy, which can be considered as the set of competences displayed by an individual who is able to make critical and creative assessments, in an autonomous and social manner, in the contemporary media environment.
In sum, a discussion on competences entails discussing the different resources that are mobilized to address both new and familiar situations (Perrenoud, 2000). Among these resources are analytical and critical reading skills, as well as questioning and transference abilities. This concept is in line with the importance traditionally accorded in the field of media literacy to the critical understanding of the media and their messages (Potter, 2010) and, as such, informed the study on the media literacy levels of young Portuguese people (Pereira et al., 2015b), which sought to determine their analytical, critical understanding and production skills involving the media, and cross-referenced them with media uses and practices. By developing this work, we also intend to discuss why media literacy matters and to reflect on how it can make a positive contribution to teenagers’ personal, social and cultural development and to their understanding of the world. Youngsters are growing up with media and technology, and they need to be able to deal with these media, facing the risks and taking advantage of the opportunities they can offer. Media literacy involves the promotion of a range of competences that are increasingly necessary and valued by school, workforce1 and society dynamics. This highlights the importance of assessing media literacy competences.
Assessing media literacy competences
Media education and literacy, the former understood as the process towards higher levels of the latter (Pinto et al., 2011), have been around for decades, albeit not in a univocal way (Buckingham, 1998; Potter, 2010). The recognition of their importance is almost common sense: both are nowadays key dimensions ‘of communications policy across a range of European countries’, although seldom transposed to formal education (Wallis and Buckingham, 2013: 528). Assessing and measuring media literacy is an issue that is still clouded by uncertainties, largely due to a lack of consensus and, of course, to the intrinsic complexity of the concept. According to Livingstone et al. (2012), ‘there remains little agreement about media literacy or how to measure it and, therefore, little evidence that efforts to improve it are effective’ (p. 4). Ferrés Prats et al. (2011) state that it has become a vicious cycle: ‘media education competences are not assessed because they are not taught. But, at the same time, they are probably not taught because, as they are not assessed, there is no awareness of the existing gaps in this area’ (p. 11).2
Despite this, Ashley et al. (2013) maintain that progress has been made. Lopes et al. (2015) looked into the work conducted in Portugal on the assessment of media literacy levels and determined that studies do exist, are growing in number and are all recent. In her PhD thesis, Paula Lopes (2013) points out that the assessment of media literacy levels has ‘been mainly theoretical-empirical and quantitative-extensive in nature, focusing actually more on practices than competences’ (p. 100). The author sought to counteract this trend by using an assessment tool whose aim was to test knowing-how-to act ‘through task or problem solving and through processes involving the assignment of meaning’ as is the case with the main extensive studies on literacy (Lopes, 2013: 1). A few years earlier, a nationwide study based on very similar principles had been conducted in Spain (Ferrés Prats et al., 2011): the levels of media literacy competence were assessed by means of a questionnaire administered as a test, operationalizing the concept of competence previously set out in Ferrés Prats (2007). The findings were then explored further using qualitative techniques.
At a European level, two studies prepared for the European Commission merit highlighting: the research coordinated by Celot and Pérez Tornero (2009) and its follow-up by the Danish Technological Institute (DTI) and the European Association for Viewers Interests (EAVI) (DTI and EAVI, 2011). Both sought to respond to the request of the European Commission: to develop a tool to measure media literacy levels in the different European Union (EU) countries. The first study (Celot and Pérez Tornero, 2009) attempted to outline an analysis framework to be employed in the task, having been tested (and later updated) in the second study (DTI and EAVI, 2011). Although this second study was based on a conceptual structure that included both individual and contextual/environmental factors, only the former were covered by the research tool used: a survey by online questionnaire administered to a sample of approximately 1000 individuals from each of the seven countries that participated in the study.
This experience in measuring media literacy levels made the difficulties associated with the task quite clear. To begin with, questionnaire-based research may be hindered by respondent fatigue; therefore, the research team suggested administering relatively short questionnaires in a non-academic language (DTI and EAVI, 2011: 101). Nonetheless, this research tool has to take into account the main dimensions of media literacy, and a balance needs to be struck between what one seeks to determine and what one can collect. This simplification is not straightforward given the complexity of the concept and the lack of consensus on what most needs to be measured (DTI and EAVI, 2011: 101). As regards the difficulties in ranking the samples by levels, it is stated that ‘there are no definite objective standards that can be used to meaningfully scale the various partial media literacy scores presented’ (DTI and EAVI, 2011: 71). The criterion laid down for establishing the levels was ‘common sense conclusions about the extreme of the low end of most scales’ (DTI and EAVI, 2011: 71). Thus, the levels were established based on ex-ante expectations and were later refined by the results, an adjustment which is common in the assessment of literacy levels (Benavente et al., 1996).
Choosing a tool such as a survey questionnaire was also discussed: ‘Since media literacy is part of everyday life and is associated with a variety of influences, contexts, and actions, surveys alone cannot provide a comprehensive assessment, but may provide a simplified indication of overall trends in media literacy levels’ (DTI and EAVI, 2011: 4). In Pereira et al. (2015b), an online survey was also used and the conclusion was reached that ‘there is still a long way to go as regards the most suitable methodologies and methods to assess media literacy competences’ (p. 92). Besides reflecting on ‘how’ to assess, one has to continue to reflect on the ‘what’ and the ‘why’. The complexity of media literacy and of the idea of competence itself, when understood as something more than the achievement of a given objective, leads to the question of ‘whether its measurement will not confine media literacy to merely functional and operational abilities’ (Pereira et al., 2015b: 92), sacrificing the study of a complex reality for the sake of obtaining a ranking.
Methods
Based on the scoping study conducted by Pereira et al. (2015b), the aim of this article is to present and discuss the findings pertaining to the media literacy competences of a sample of Portuguese teenagers, as well as to problematize the task of constructing a scale capable of identifying and assessing media literacy competences.
The study focused on students in the 12th grade of Portuguese secondary school, mostly between the ages of 17 and 18. As the 12th grade is the final year of compulsory education in the Portuguese education system, it was considered important to know the media-related competences of a group that would then move on to higher education or enter the job market.
To constitute the sample, the authors employed a non-probability quota sampling technique, according to the Portuguese NUT II (Eurostat’s Nomenclature of Territorial Units for Statistics) geographical division, and the source was the list of students enrolled in the 12th grade in the 2011/2012 academic year provided by the Directorate General for Education and Science Statistics of the Ministry of Education and Science (Direção-Geral de Estatísticas da Educação e Ciência/Ministério da Educação e Ciência, Portuguese acronyms: DGEEC/MEC). For the purposes of the sample, an average of 20 students per class was considered. As the target sample was around 1000 students, this meant a total of 50 classes were surveyed. Within each NUT, the schools were selected based on convenience sampling, with the class quota per NUT being maintained. Convenience sampling was chosen because it was expected to yield a higher response rate. The selection sought to ensure that the geographical diversity of each zone was kept so that it included schools from predominantly urban areas, from semi-urban areas and from predominantly rural areas, following the classification drawn up by Statistics Portugal (Instituto Nacional de Estatística, Portuguese acronym: INE). The final selection of the schools hinged on each school’s willingness to participate. Having chosen the schools, the next step was to establish the number of classes to be surveyed in each of the scientific-humanities courses (Science and Technologies, Languages and Humanities, Socioeconomic Sciences and Visual Arts), which was proportional to the total number of classes in each of these courses according to the data provided by the DGEEC/MEC.
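The proportional allocation of class quotas described above can be sketched as a largest-remainder split. This is a hypothetical illustration, not the study's procedure or data: the region names and enrolment counts below are invented.

```python
# Hypothetical sketch of proportional quota allocation: splitting a fixed
# number of classes across regions in proportion to enrolment, using the
# largest-remainder method. Region names and counts are invented examples.
def allocate(quota_total, enrolment):
    """Allocate quota_total units across regions proportionally to enrolment."""
    total = sum(enrolment.values())
    exact = {r: quota_total * n / total for r, n in enrolment.items()}
    alloc = {r: int(exact[r]) for r in exact}  # floor of each exact share
    leftover = quota_total - sum(alloc.values())
    # Hand the remaining units to the largest fractional remainders
    for r in sorted(exact, key=lambda r: exact[r] - alloc[r], reverse=True)[:leftover]:
        alloc[r] += 1
    return alloc

quotas = allocate(50, {"North": 400, "Centre": 250, "Lisbon": 300,
                       "Alentejo": 30, "Algarve": 20})
```

A largest-remainder split guarantees the per-region quotas sum exactly to the total while staying within one unit of each exact proportional share.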
Once the sample was determined, an online questionnaire was administered.3 It consisted of two parts: one on media use and practices, and another resembling a knowledge test, since most of its questions were exercises intended to simulate and stimulate decision making and the application of knowledge. These exercises sought to bring out knowledge and skills as well as reflections, opinions and intuitions. The questions comprising the first part of the questionnaire were not used in the scale that will be presented next. It was considered that scoring the frequency questions was not consistent with the adopted concepts of media literacy and competence: higher frequency of use does not necessarily equate to a higher level of literacy. Emphasis was thus put on the questions addressing the second dimension of media literacy: critical understanding. Production and participation through the media were also assessed. Despite not being scored, the questions pertaining to access and use were used in the analysis of the three levels of media literacy that were defined. To identify statistically significant differences, non-parametric tests were used since the conditions for the applicability of parametric tests were generally not met. For the comparison of independent or non-related groups, the Mann–Whitney test (for two groups) and the Kruskal–Wallis test (for more than two groups) were used. When the Kruskal–Wallis test indicated significant differences, pairwise Mann–Whitney tests were used to locate them. All the tests were performed at a 95% confidence level (p < .05).
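The two-group comparison described above can be sketched in a few lines. This is an illustrative, self-contained implementation of a two-sided Mann–Whitney U test with a normal approximation (and without tie correction), not the study's actual analysis code:

```python
# Illustrative sketch of the Mann-Whitney U test used for two-group
# comparisons: rank all scores jointly, compute U from the rank sum of
# one group, and approximate the p-value with a normal distribution.
# Simplified: no tie correction, no continuity correction.
import math

def rank(values):
    """Assign average ranks (1-based) to values, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied run, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney(a, b):
    """Two-sided Mann-Whitney U test via normal approximation."""
    n1, n2 = len(a), len(b)
    ranks = rank(list(a) + list(b))
    r1 = sum(ranks[:n1])                 # rank sum of group a
    u1 = r1 - n1 * (n1 + 1) / 2          # U statistic for group a
    mu = n1 * n2 / 2                     # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u1, p
```

For example, two fully separated score lists give a p-value well below .05, while identical lists give a p-value of 1. In practice one would reach for a library routine (e.g. SciPy's `mannwhitneyu` and `kruskal`), which also handles tie corrections and exact small-sample p-values.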
In total, 26 questions were assessed.4 This assessment was based on a scale from 0 to 100 points. The questions fell into three categories each with a different score (10, 6 and 3 points), according to the level of difficulty (Figure 1).5
The scale had three levels, which were determined as follows: students placed in Level 1 had scores below the sample average (29.01 points), students in Level 2 had scores between the average and the positive threshold (49.50 points was considered the threshold for a positive score) and Level 3 students all had positive scores.
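The placement rule can be written as a simple threshold function. This is a minimal sketch using the thresholds reported above; the handling of scores falling exactly on a boundary is an assumption, as the text does not state it:

```python
# Minimal sketch of the three-level placement rule, using the study's
# reported thresholds. Boundary handling (scores exactly at a cut-off)
# is assumed, not stated in the original.
AVERAGE = 29.01   # sample average score (Level 1/2 cut-off)
POSITIVE = 49.50  # threshold for a positive score (Level 2/3 cut-off)

def level(score):
    """Place a 0-100 score on the three-level media literacy scale."""
    if score < AVERAGE:
        return 1
    if score < POSITIVE:
        return 2
    return 3
```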
A total of 848 questionnaires were initiated and 679 were fully completed, corresponding to 68% of the target sample of around 1000 students. Of the 679 valid questionnaires, 59% were completed by female students and 41% by male students. Unsurprisingly, the majority (86%) were between the ages of 17 and 18, while 9% were 19 years old. Considering the four scientific-humanities courses of the Portuguese non-vocational secondary education system – Science and Technologies, Languages and Humanities, Visual Arts and Socioeconomic Sciences – most of the students (43%) were enrolled in the first one. Approximately two-thirds attended schools in the North and Lisbon regions. Roughly 87% intended to go on to higher education. Most of the households in the sample (68%) resided in predominantly urban areas, with monthly incomes mostly between €500 and €1000 (around 29%) or between €1001 and €2000 (around 26%).
Results
Media access and uses: A connected generation
The sample of this study is consistent with a connected generation. Practically all the students stated they had, either at home or personally, a television, a mobile phone (mostly smartphones) and a laptop with Internet access. In these three cases, access percentages are invariably above 95%. The Internet and television available in the households are mostly accessed through a wireless connection (92%) and by subscription (83%), respectively. The majority of the respondents stated they used these media recurrently: over 90% said they used the computer and the Internet either ‘always’ or ‘many times’; almost 77% reported these frequencies of use for television; practically all of those who own a smartphone (69% of the sample) stated they used it all the time or many times. Most of the respondents indicated that entertainment was what they were seeking when using each of these media. As for the self-stated less frequent uses, classified as ‘rarely’ and ‘never’, the more traditional media such as newspapers (57%), magazines (49%), radio (46%) and films in cinemas (41%) stand out.
Although the access figures for the aforementioned media were generally high, there were statistically significant differences according to the characteristics of the household regarding the use of the Internet and smartphones.6 As can be seen in Table 1, the use of smartphones and, particularly, the Internet is high in all households. However, it increases significantly as the net disposable monthly income rises.7
Table 1. Frequency of use according to household net disposable monthly income.

In the case of the smartphones, both parents’ educational background appears to impact how regularly they are used: students whose parents only have basic schooling use smartphones significantly less than those whose parents went to secondary school or university.
The results achieved with regard to media access and uses do not differ significantly from what other national (Amaral et al., 2017; OberCom, 2014) and international (EU Kids Online, 2014; Ofcom, 2016) studies have pointed out. This group can therefore be characterized as a group of average media users. Digital technologies are part of children and young people’s everyday lives, introducing new forms of sociability through cross-media consumption and interaction. The Unicef (2017) report ‘State of the World’s Children 2017: Children in a digital world’ confirms the massive presence of children in the online world, indicating that young people (15–24) are the most connected age group: worldwide, 71% are online, compared to 48% of the total population. Also, as other studies have shown (Pereira et al., 2015a), today the remarkable differences among media users lie more at the level of critically searching for, understanding, analysing and creating information than at the level of access and usage. In developed countries, the digital divide is today expressed above all in the skills to analyse, evaluate and produce content. Using the media does not in itself guarantee the development of media literacy competences, as the following points will confirm.
Media literacy levels and competences
By resorting to the scale presented above, it was possible to place each student in one of the three levels created. The highest and lowest overall scores were 71.79 and 2.73 points, respectively. The distribution of the students by level is shown in Table 2.
Table 2. Student placement by level.

The group with the best results (Level 3) is the smallest, comprising just 32 respondents. The highest score was 71.79 points and the lowest did not exceed the required minimum threshold (49.50). The group’s average was 55.97 points, with a standard deviation of 5.67.
The Level 2 group consists of 295 students, approximately 43% of the sample. Their scores ranged between 29.01 and 49.50 points, with an average of 37.05 points and a standard deviation of 5.52.
The group with the lowest scores is the most numerous. A total of 352 students (52%) did not exceed the average score of the 679 respondents. This group’s average was 19.82 points, with a standard deviation of 6.42. What further highlights the negative scenario in this level is the fact that 254 of the 679 respondents (i.e. 38%) were not even able to score 25 out of the possible 100 points.
In all of the 26 questions analysed, Level 1 students have significantly lower scores than those in the remaining groups. In 16 questions, the differences between Levels 2 and 3 are also relevant from a statistical standpoint, always to the advantage of the 32 students with the highest scores.
Who are they? Characterization of young people by media literacy level
Unsurprisingly, students in the three levels are of very similar age, with averages being around 17 years. Furthermore, the overwhelming majority intended to go on to higher education. In all three groups, as is the case in the aggregate sample, females are the majority. In Level 1, most of the students attended schools in the Lisbon region. In the other two levels, the majority of the schools are from the North. In fact, the differences in scores between schools in the North (average of 30.6 points) and in Lisbon (average of 27.8 points) are statistically significant. Despite these disparities, most of the young people lived in predominantly urban areas in all three levels. Finally, when comparing the declared net monthly incomes and parents’ education with the scores obtained, using the aforementioned test, no statistically relevant difference was found. The sociodemographic variables are detailed in Table 3.
Table 3. Sociodemographic characteristics of the three groups (Pereira et al., 2015b).

As far as access and uses are concerned, the scores are not very different from those of the overall scenario described above: television, Internet, computer and smartphones have very high dissemination and usage rates. There were, nonetheless, statistically significant differences in the use of the Internet and the computer: Level 1 students use them significantly less, albeit still frequently (averages of 4.5 and 4.4, respectively), than their Level 3 peers (average of 4.8 for both). Besides these differences, three other statistically relevant ones were found, all to the advantage of Level 3: the 32 students in this group read significantly more newspapers, magazines and printed books than their peers in the other groups.
Given the prevalence of the Internet and television, a few practices and uses in each of the levels should be mentioned. The most frequent online activities, with averages of 4 or above in all levels, are Internet searches, listening to music, using social networks (mainly to chat with friends, while content production or content sharing activities are much less frequent) and watching videos, which was the only activity where statistically significant differences were found, with Level 1 students watching considerably less than the others. When it comes to television, series and films stand out. However, news programmes have a very high frequency in Level 3 (average of 4.3), being significantly higher than in the other two levels, whose averages are 3.8 (Level 2) and 3.6 (Level 1).
More and less – What do they know about media?
An effort was made to ensure that the assessed questions were varied in terms of level of difficulty, types of tasks and issues covered (from media contents to institutions and players), thereby seeking to mirror the complexity of the field of media. Besides, topics traditionally more associated with information literacy, such as searching for information and bibliographic referencing,8 were also covered given the particular interest the School Libraries Network has in the issue. Table 4 summarizes the general competences assessed in each dimension.
Table 4. General competences assessed and their main evaluating goals.

It is important to bear in mind that one question could mobilize more than one competence and that even within the same competence, there were different levels of difficulty. Not surprisingly, the questions with higher rates of correct answers were the ones related to simpler competences, to right or wrong answers, to the identification and distinction of different genres, media, actors and institutions. However, more complex skills, implying inferences, reasoning and knowledge of alternatives and broader contexts, scored much fewer points. Table 5 summarizes the competences evaluated in the five most and least scored questions, describing what was assessed.
Table 5. Most and least scored competences and their questions.

The five questions with the highest number of scored answers (always covering over 80% of the sample) were the same in all three groups. They all fall into the section on understanding, analysing and critically assessing communication media and, with one exception, have the lowest level of difficulty. These questions required students to identify and distinguish between the provided elements (such as opinion pieces or search engines) or adopt an attitude. The task which had the highest number of scored answers was on information literacy and addressed bibliographic referencing. Students were asked to select an answer from the provided options, which were quite straightforward. The question itself assesses students’ attitudes more than their competences. Nonetheless, 26% of the respondents state that when they write an assignment, they reference their bibliographic sources ‘sometimes, when I remember’ and 4% say ‘never, I do not think it is important’. Around 2% state ‘never, I did not know I was supposed to’ and for 1% the chosen answer is ‘no, because I do not know how to do it’.
As for the five questions with the highest number of unscored answers (invariably 90% or over), they are also practically the same for all three groups. Two questions fall into the section on analysis, understanding and assessment, and three questions belong to the section on production and participation, all of which have Level 2 and 3 difficulty. The poor scores obtained in the latter three are not surprising, since the various questions that focused on this type of practices indicated low levels of production and participation among the sample. The least successful of the five tasks falls into the section on analysis and interpretation. In this question, students viewed an extract of the film ‘007 Casino Royale’ and only 33 answers were scored. In total, 95% of the respondents did not get a score for their answers, which means that they were unable to detect a detail which impacts the whole production of the film: the fact that Sony owns one of the studios producing the motion picture, which was acknowledged in the opening credits and systematically placed its products throughout the film. The differences between the groups should be noted, however, with Level 1 group being the one with the highest percentage of unscored answers. This difference is statistically significant when compared with the two levels with higher scores.
Discussion and final remarks
This article presents part of a broader study which sought to respond to the call issued in 2014 by the Media Literacy Expert Group (currently within the DG Connect – Directorate General for Communications Networks, Content and Technology) for pilot studies to be conducted in each of the different member countries to assess media literacy levels. In Portugal, it was initially taken up by GILM – Grupo Informal sobre Literacia para os Media (Informal Group on Media Literacy) and subsequently by researchers at the Communication and Society Research Centre of the University of Minho, with the support of the now defunct Gabinete para os Meios de Comunicação Social (Office for the Media) and the School Libraries Network. The choice was made to focus the study on 12th-grade students, between the ages of 17 and 18, attending public schools in different parts of the country. Among other reasons for this choice is the interest in determining the competences these students have in the media sphere at a time when they are about to complete compulsory education and can either go on to higher education or join the job market.
Despite being a scoping study, the research does make an important contribution, particularly in terms of methodology, to studies aiming to assess media literacy competences. As was discussed in the first part of this article, not much research has been done in this field and there is still a long way to go until an appropriate and reliable method is devised that achieves the purpose and enables one to address and respond to the complexity of a dynamic and multidimensional phenomenon. If we take into account the different dimensions of media literacy, it is clear there is a substantial amount of research on access, uses and perceptions, but what is not nearly as common is research focusing on critical analysis and critical understanding competences, and on production/creation competences. Evidence of this difficulty can be found in the summary report on the pilot initiatives undertaken in various European countries, as explained above, at the behest of the European Commission, Media Literacy Unit (Celot, 2015). In this report, there is once again the conflation of studying media practices with studying competences, with most of the initiatives focusing on access to and use of media and not on critical analysis, understanding and production skills.
As for the study being presented, we are aware that survey questionnaires, even when they generate quantifiable and scientifically valid results, cannot be confined to mere data collection. A questionnaire is also the context in which it is administered: ‘a social interaction situation that entails a previous, and in most cases tacit, “communication contract”’ (Gonçalves, 2007: 203). The validity of this contract depends naturally on the cooperation of the interviewees, who are expected to agree to ‘(cor)respond to the questionnaire as it was presented to them’ (Gonçalves, 2007: 203). Responding to the questionnaire that was administered required a substantial cognitive effort and, consequently, a willingness to make that effort. Completing the questionnaire was not a leisurely or entertainment activity as is often the case with media use. The communication contract offered to the students was a particularly demanding one that did not necessarily match their interests or practices. Thus, to what extent was the questionnaire – long and arduous as it was – completed with full commitment? The potential implications arising from this question are by no means insignificant and underline the risks a possibly decontextualized assessment entails: the abstraction which derives from evoking memories, opinions and knowledge, as well as from carrying out simulations, does not equate to teenagers’ everyday practices taking place in complex contexts with a variety of elements relating and interacting with one another.
The results, admittedly low and below the researchers’ expectations, cannot be viewed merely from the standpoint of the students’ supposed knowledge. It is also important to examine the questionnaire and the scale and their suitability as tools for measuring competences. Furthermore, one has to establish what, in this case, teenagers should know about the media and what they should be asked in order to ascertain their media literacy levels. What is lacking is a framework of reference, be it national or European, for the media competences that different age groups should develop, which could serve as a basis for assessing those competences. Such a framework could draw on each country’s existing guidelines for integrating media literacy into the curriculum. In the case of Portugal, in 2014, and concurrently with the present study, the Media Education Guidance for Preschool Education, Basic Education and Secondary Education (Pereira et al., 2014) was drawn up and published by the Ministry of Education; it could serve as the basis for future research in this field.
This was undoubtedly one of the main problems identified by the study, and it gave rise to questions about the issues selected for the questionnaire. By way of example, consider the issue of public service media (PSM). Should this issue in fact be considered when assessing media literacy levels? Is it an issue that students should have some knowledge of by the time they reach the 12th grade, at 17 or 18 years old? The sample in this study had very little knowledge of it: less than 2% managed to indicate one distinctive feature of the PSM, only 19% were able to name at least two media outlets from the vast universe that makes up the PSM, and merely 17% indicated one specific source of funding. A higher percentage (38%) knew who the owner was, possibly because this is a subject the media themselves cover more often.
The results, however, do merit attention, for although there are misgivings about including certain issues such as the PSM, there are others – namely, those regarding the ability to search for information and reference sources, or involvement in production and participation through the media – which were expected to be more familiar to students but nonetheless did not yield better scores. The teenagers consistently obtained higher scores on questions about perceptions than on those assessing competences – possibly the difference between what they think is the expected right answer and what they in fact know. Hobbs (2017) also underlines that ‘the measurement of digital and media literacy has revealed important gaps between self-assessment (measured by self-report) and actual performance (measured by competency tasks)’ (p. 261). Furthermore, it was clear that accessibility and access to the media alone are not enough for production to occur. In a group with a strong connection to the media, production and participation practices, in addition to being rather simple, are not recurrent. The results thus suggest that, throughout compulsory education, the vast majority of these students did not gain knowledge about the media that would enable them to analyse media messages or to understand the media’s role in society, or were not given the opportunity to do so. If one also considers that the majority of students who obtained the highest scores (Level 3) come from families in which the parents have a higher education level and a skilled job – a feature less frequent among students at Level 2 and even more so among those at Level 1 – it becomes clear how important it is to offer all students equal opportunities to develop media literacy competences, so that media literacy does not become an issue merely for the elites and for young people from families with greater access to, and ability to analyse, information.
To conclude, one should mention the importance of assessing the development and implementation of media literacy with a view to raising its profile, upholding its social, civic and cultural importance and supporting the formulation of public policies that promote it. Beyond assessing individual levels and factors, however, more attention should be given to assessing not only initiatives and projects seeking to promote and foster media literacy but also the impact they have on individuals. Taking up the question posed at the beginning of this study – what should teenagers of this age know about the media? – the answer is neither simple nor linear, but we emphasize the importance of developing teenagers’ skills in comprehending, analysing and producing media messages. Reflecting on and analysing their own media consumption habits are other fundamental competences they should develop. Through the promotion of these skills, teenagers should progressively build knowledge of the different types of media, texts, genres and meanings; of how messages and contents are produced and for what purposes; of the techniques the media use to communicate; of media representations and stereotypes; and of how audiences are addressed and reached, how they receive, interact with and decode contents, and how they use media and technologies for personal expression and communication. This media literacy work should have the development of critical competences as its umbrella and should stimulate ‘intellectual curiosity and the ability to ask “how” and “why” questions’ (Hobbs, 2017: 266). As Potter (2010) claims, ‘media literacy must be developed. No one is born media literate’ (p. 681).
Acknowledgements
This paper draws on a research project supported by the now-defunct Portuguese Office for the Media and by the School Libraries Network.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Notes
1.
The 2016 World Economic Forum report on the future of jobs (World Economic Forum, 2016) ranks the most important workplace skills for the following 5 years. Among these skills are, for example, critical thinking, creativity, emotional intelligence, negotiation, and judgement and decision-making – some of the core skills of media literacy.
2.
Hobbs and Frost (2003) worked precisely on assessing the impact of media education within formal education in a North American school, having used ‘paper and pencil measures’ (p. 352) which are very similar to the exercises in the tests discussed in this article.
3.
The School Libraries Network (Rede de Bibliotecas Escolares, Portuguese acronym: RBE), through the teacher-librarians of the participating schools, assisted in administering the questionnaires. A pretest was also carried out on four classes from two non-participating schools.
4.
These questions were selected for assessment from a total of 68 that made up the questionnaire. Initially, 28 were meant to be assessed, but 2 were excluded as it was found they were not worded clearly. The remaining questions that were not assessed were either filter questions or assessed frequency of access and use.
5.
The level of difficulty was determined both by the pretest and the discussions held between the authors of the study and the authorities that supported it. Subsequently, the level of difficulty was compared with the number of wrong/unscored answers to each question. It was concluded that the distribution was appropriate: the supposedly more difficult questions were, on the whole, empirically confirmed as such.
6.
The data derive from a five-item scale (always, many times, sometimes, rarely and never), with the items subsequently converted into numerical values ranging from 1 to 5. This conversion made it possible to perform non-parametric tests.
7.
In total, 141 students did not know their household’s net disposable income.
8.
Even though navigation competences, be they merely instrumental or eminently ethical, are traditionally associated more with information literacy, the complexity of the media underlined in Fastrez (2010) – and subsequently developed in Roosen (2013) – justifies their inclusion in media literacy.
ORCID iD
Sara Pereira
https://orcid.org/0000-0002-9978-3847
References
Amaral, I, Reis, B, Lopes, P (2017) Práticas e consumos dos jovens portugueses em ambientes digitais [Practices and consumption of Portuguese youth in digital environments]. Estudos em Comunicação 24: 107–131. Available at: http://ojs.labcom-ifp.ubi.pt/index.php/ec/article/view/70
Ashley, S, Maksl, A, Craft, S (2013) Developing a news media literacy scale. Journalism & Mass Communication Educator 68(1): 7–21.
Benavente, A, Rosa, A, da Costa, AF (1996) A literacia em Portugal – Resultados de uma pesquisa extensiva e monográfica [Literacy in Portugal – Results from an extensive and monographic study]. Lisboa: Fundação Calouste Gulbenkian.
Buckingham, D (1998) Media education in the UK: Moving beyond protectionism. Journal of Communication 48(1): 33–43.
Celot, P (2015) Assessing Media Literacy Levels and the European Commission Pilot Initiative. Brussels: EAVI.
Celot, P, Pérez Tornero, JM (coords) (2009) Study on assessment criteria for media literacy levels. Final report. Available at: http://ec.europa.eu/assets/eac/culture/library/studies/literacy-criteria-report_en.pdf
Drexel, I (2003) Two lectures: The concept of competence – An instrument of social and political change & Centrally coordinated decentralization – No problem? Lessons from the Italian case. Working paper 26. Available at: http://uni.no/media/manual_upload/60_N26-03-Drexel.pdf
DTI and EAVI (2011) Testing and refining criteria to assess media literacy levels in Europe – Final report. Available at: http://www.umic.pt/images/stories/publicacoes4/final-report-ML-study-2011_en.pdf
EU Kids Online (2014) EU Kids Online: Findings, methods, recommendations (deliverable D1.6). Available at: http://eprints.lse.ac.uk/60512/
Fastrez, P (2010) Quelles compétences le concept de littératie médiatique englobe-t-il? [What competences does the concept of media literacy encompass?]. Recherches en Communication 33: 35–52.
Ferrés Prats, J (2007) La competencia en comunicación audiovisual: dimensiones e indicadores [Audiovisual communication competence: dimensions and indicators]. Comunicar XV(29): 100–107.
Ferrés Prats, J, García Matilla, A, Aguaded Gómez, JI (2011) Competencia Mediática – Investigación sobre el grado de competencia de la ciudadanía en España [Media competence – Research on the level of competence of the citizenry in Spain]. Barcelona: Instituto de Tecnologías Educativas, Consell de l’Audiovisual de Catalunya and Comunicar. Available at: http://rabida.uhu.es/dspace/bitstream/handle/10272/6876/Competencia_mediatica.pdf?sequence=2
Gonçalves, A (2007) A difícil arte de perguntar: aporias e apostas da redacção do questionário para inquérito sociológico [The difficult art of asking questions: aporias and composition choices in sociological surveys]. Comunicação e Sociedade 12: 201–211.
Guzmán Marín, F (2012) El concepto de competencias [The concept of competences]. Revista Iberoamericana de Educación 60(4): 1–13.
Hobbs, R (2017) Measuring the digital and media literacy competencies of children and teens. In: Blumberg, FC, Brooks, PJ (eds) Cognitive Development in Digital Contexts. London: Academic Press, pp. 253–274.
Hobbs, R, Frost, R (2003) Measuring the acquisition of media-literacy skills. Reading Research Quarterly 38(3): 330–355.
Livingstone, S, Papaioannou, T, Grandío Pérez, MM (2012) Critical insights in European media literacy research and policy. Medijske Studije/Media Studies 3(6): 2–12.
Livingstone, S, van Couvering, E, Thumim, N (2008) Converging traditions of research on media and information literacies: Disciplinary, critical, and methodological issues. In: Coiro, J, Knobel, M, Lankshear, C (eds) Handbook of Research on New Literacies. New York: Routledge, pp. 103–132.
Lopes, P (2013) Literacia mediática e cidadania – Práticas e competências de adultos em formação na Grande Lisboa [Media literacy and citizenship – Practices and competences of adults in training in the Greater Lisbon area]. PhD thesis, ISCTE – Instituto Universitário de Lisboa, Lisboa. Available at: https://repositorio.iscte-iul.pt/bitstream/10071/8666/1/TESE_FINAL_Paula_Lopes_JURI.pdf
Lopes, P, Pereira, S, Moura, P (2015) Avaliação de competências de literacia mediática: o caso português [Evaluating media literacy competences: the Portuguese case]. Revista Observatório 1(2): 42–61.
Moeller, S, Joseph, A, Lau, J (2011) Towards media and information literacy indicators. In: Background document of the expert meeting, Bangkok, Thailand, 4–6 November. Paris: UNESCO. Available at: https://www.ifla.org/files/assets/information-literacy/publications/towards-media-and-Information-literacy-indicators.pdf
OberCom (2014) A Internet em Portugal. Sociedade em rede 2014 [The Internet in Portugal: Network society 2014]. Available at: https://obercom.pt/wp-content/uploads/2016/06/A-Internet-em-Portugal-Sociedade-em-Rede-2014.pdf
Ofcom (2016) Children and parents: Media use and attitudes report 2016. Available at: http://www.ofcom.org.uk/__data/assets/pdf_file/0034/93976/Children-Parents-Media-Use-Attitudes-Report-2016.pdf
Pereira, S, Pereira, L, Melro, A (2015a) The Portuguese programme one laptop per child: Political, educational and social impact. In: Pereira, S (ed.) Digital Literacy, Technology and Social Inclusion: Making Sense of One-to-One Computer Programmes around the World. Vila Nova de Famalicão: Húmus, pp. 29–100.
Pereira, S, Pinto, M, Madureira, E (2014) Media Education Guidance for Preschool Education, Basic Education and Secondary Education. Lisbon: DGE/ME. Available at: http://www.dge.mec.pt/sites/default/files/ECidadania/Referenciais/media_education_guidance_dge_pt.pdf
Pereira, S, Pinto, M, Moura, P (2015b) Níveis de Literacia Mediática: Estudo Exploratório com Jovens do 12º ano [Media Literacy Levels: Exploratory Study with 12th Grade Students]. Braga: Centro de Estudos de Comunicação e Sociedade. Available at: http://www.lasics.uminho.pt/ojs/index.php/cecs_ebooks/issue/view/169
Perrenoud, P (1995) Des savoirs aux compétences – De quoi parle-t-on en parlant de compétences? [From knowledge to competences – What are we talking about when discussing competences?]. Pédagogie Collégiale 9(1): 20–24.
Perrenoud, P (1999) Construire des compétences, est-ce tourner le dos aux savoirs? [Is building competences turning one’s back on knowledge?]. Pédagogie Collégiale 12(3): 14–17.
Perrenoud, P (2000) Pedagogia diferenciada: das intenções à ação [Differentiated pedagogy: from intentions to action]. Porto Alegre, Brazil: Artmed.
Pinto, M, Pereira, S, Pereira, L (2011) Educação para os Media em Portugal – Experiências, actores e contextos [Media education in Portugal – Experiences, agents and contexts]. Lisboa: Entidade Reguladora para a Comunicação Social.
Potter, WJ (2010) The state of media literacy. Journal of Broadcasting & Electronic Media 54(4): 675–696.
Roosen, T (ed.) (2013) Les compétences en éducation aux médias [Competences in media education]. Bruxelles: Conseil Supérieur de l’Éducation aux Médias. Available at: http://www.educationauxmedias.eu/sites/default/files/files/CompetencesEducationMedias_Web.pdf
Sultana, RG (2009) Competence and competence frameworks in career guidance: Complex and contested concepts. International Journal for Educational and Vocational Guidance 9: 15–30.
Unicef (2017) The state of the world’s children 2017: Children in a digital world. Available at: https://www.unicef.org/publications/index_101992.html
Wallis, R, Buckingham, D (2013) Arming the citizen-consumer: The invention of ‘media literacy’ within UK communications policy. European Journal of Communication 28(5): 527–540.
Williams, R (1983) Keywords: A Vocabulary of Culture and Society. New York: Oxford University Press.
World Economic Forum (2016) The Future of Jobs: Employment, Skills and Workforce Strategy for the Fourth Industrial Revolution. Geneva: World Economic Forum.



