
Keep calm, changes are coming

The team here at the UOC Library is working on a new blog and will no longer be updating LibTechNotes. We would like to thank you all for following our blog over the years, and we will let you know about the Library’s new blog so you can keep track of our activities in a new, updated and enriched format.


Some resources about Open Education

In this post I would like to share some interesting information about online learning and open educational resources that recently came my way.

On the one hand, the OER Research Hub, which provides a focus for research on the impact of OER on learning and on the particular influence of openness, has released the third and final report focusing on educators’ use of OER: Data Report 2013-2015: Educators. This report completes the previous ‘OER Evidence Report 2013-2014’, which brought together a range of evidence around the project’s research hypotheses and provides an overview of the impact OER is having on a range of teaching and learning practices.


In 2013, the William and Flora Hewlett Foundation funded the OER Research Hub project to provide a focus for research designed to answer the overall question ‘What is the impact of OER on learning and teaching practices?’ and to identify the particular influence of openness.

On the other hand, there is a resource that presents a broad view of today’s online learning options with the goal of serving prospective online students: The Online Learning Guidebook. This guide, created by AccreditedSchoolsOnline.org, analyzes the current state of online education, where it stands and where it’s heading. It takes an in-depth look at the benefits and experience of an online education, at choosing an online school or program, and at the methods, technologies and resources employed.



The most popular tools for research

An interesting survey produced by the Innovation in Scholarly Communication group at Utrecht University shows the main tools used by researchers.

The study organizes researchers’ most common activities into six areas: discovery, analysis, writing, publication, outreach and assessment. Based on these areas, a series of questions asks researchers to indicate which tools they usually use.

The study has been popular and has already analysed responses from 1,000 researchers, but institutions and societies still have the chance to join and include their own researchers.

The preliminary results largely confirm what might be expected, but below are some of the aspects that drew our attention.

  • 92% of researchers use Google Scholar to find information.
  • 82% of researchers use some sort of tool to keep up-to-date with their field.
  • 85% of researchers prefer reading PDF files.
  • Excel is the main tool used to process data (77% of responses).
  • Only 52% of researchers use some kind of tool to share/archive code and data.
  • 77% of researchers have a strategy for disseminating their research beyond the usual channels of academia.

The image below is a graphical summary of the current situation with regard to the tools used by researchers: traditional, modern, innovative and experimental.

[Image: graphical summary of the tools used by researchers, from traditional to experimental]

Finally, it is worth highlighting that the questionnaire is both intuitive and attractively designed. It shows the latest trends and tools being used by researchers. The study is also open to librarians, who can indicate the tools they recommend to their users.

 

The 44th LIBER Annual Conference took place between 24 and 26 June at Senate House in London. Although the conference focused on the topic of “Towards Open Science”, I would like to highlight some of the comments and conclusions from the “Libraries and research data: towards a new leadership role” workshop, which I had the opportunity to attend, as well as those that came up on Twitter.

For some time now, scientometricians, social scientists and research administrators have been questioning the reliability of scientific evaluation models based solely on quantitative analysis. Indeed, the San Francisco Declaration on Research Assessment, published back in 2013, explicitly came out against the use of the impact factor as a tool in processes for evaluating scientific activity.


[Illustration by David Parkins]

More recently, on 22 April, the journal Nature published the Leiden Manifesto, which offers ten best practices for metrics-based evaluation of research. In other words, the manifesto lists ten limitations of merely quantitative evaluation.

Its ten principles are not news to scientometricians, although none of us would be able to recite them in their entirety because codification has been lacking until now.

Data on scientific activities are increasingly being used for the governance of science. The problem is that evaluation has gone from being based on the views of experts to metrics alone, and web platforms, new bibliometric indicators and institutional rankings have proliferated. The people behind this manifesto believe that there is a risk of damaging the scientific system through use of the very instruments designed to make it better.

The manifesto stresses that data taken from bibliometric indicators must not replace informed judgement and, thus, that assessors must again be given the ability to evaluate in these terms and the responsibility to make decisions.

The best decisions are taken by combining robust statistics with sensitivity to the aim and nature of the research that is evaluated. Both quantitative and qualitative evidence are needed.

The 10 principles can be summarized as follows:

1. Quantitative indicators cannot replace the judgement of expert assessors, but they can be used to support it.

2. Evaluation of research activity has to adapt to the mission and objectives of the institution, individual or group being evaluated.

3. Indicators need to be developed that reflect the local and regional impact of research activities, as well as research produced in languages other than English.

4. The data collection and analysis processes have to be open, transparent and simple.

5. Those evaluated have to be able to verify the analysis of the indicators being used for the evaluation and, if they disagree, request re-evaluation.

6. Differences in impact between different fields of research have to be taken into account when producing indicators.

7. Individual evaluation of researchers has to be based on qualitative assessment of their portfolio. Indicators cannot be used without taking into account the researcher’s context.

8. False precision and misplaced concreteness must be avoided; journal impact factors published to three decimal places, for example, suggest a precision that the underlying citation data cannot support.

9. The effects of certain indicators as incentives for certain activities and disincentives for others must be taken into account.

10. The indicators have to be reviewed and updated regularly.

 

Bibliography

Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429–431. doi:10.1038/520429a

American Society for Cell Biology. (2013). San Francisco Declaration on Research Assessment. http://am.ascb.org/dora/ (accessed January 31, 2014).
