A Scientometric Evaluation of 50 Greek Science and Engineering University Departments using Google Scholar

Pitsolanti, Papadopoulou, and Tselios: A Scientometric Evaluation of 50 Greek Science and Engineering University Departments using Google Scholar



Scientometric evaluation and ranking of universities, departments and scholars is a widely studied topic which informs decisions both for students and for academics. In general, two main factors are subject to evaluation: the quality of the educational process offered by a specific institution or scholar, and the quality and quantity of the scientific output. The latter is largely based on scientific indices such as the total number of publications, citations, h-index and i10-index, as well as derived indices such as the m-index, which are calculated from the aforementioned ones.[1] The process of collecting such data is greatly facilitated by web-based scientific databases, namely Google Scholar, Web of Science and Scopus. In addition, there are databases such as EconLib and PubMed which focus on a specific scientific discipline.

One of the most popular indicators for evaluating scholars’ scientific quality is the h-index.[1] Its main advantage is that it combines scientific productivity with quality, as measured by the impact of a scholar’s work on the rest of the scientific community in terms of citations.[2] The h-index is based on the distribution of citations over a scholar’s publications and is calculated as follows: a researcher has an h-index equal to n if n of her publications have received at least n citations each, while each of her remaining publications has received fewer than n citations.[1]
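The definition above can be sketched in a few lines of code. The following is a minimal illustration of the h-index (and the i10-index mentioned earlier); the citation counts are invented illustration data, not figures from the study.

```python
def h_index(citations):
    """Largest n such that n papers have at least n citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations each."""
    return sum(1 for c in citations if c >= 10)

papers = [25, 17, 12, 10, 8, 5, 3, 1]  # citations per paper (hypothetical)
print(h_index(papers))    # 5: five papers have at least 5 citations, but not six with at least 6
print(i10_index(papers))  # 4
```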

A second advantage of the h-index is that it makes it fairly easy to evaluate a researcher objectively and quantitatively, a process which is necessary for decisions related to hiring and promotion of professors, research funding, and nominations for awards such as the Nobel and Turing prizes.[1] For instance, Hirsch[1] calculated the h-index of Nobel Prize winners and found that 84 per cent of them had an h-index exceeding 30. Moreover, members newly elected in 2005 to the US National Academy of Sciences in physics and astronomy had an average h-index of 46 at the time.

At the same time, the h-index allows comparisons between researchers to be made in a more complete manner.[1] Not surprisingly, the h-index is frequently used as a measurement instrument for scientists, journals and departments of different disciplines.[3-10] Moreover, in rankings of computer science departments, it was found that rankings based on assessment reports for scholars and their ratings according to the h-index showed a strong positive correlation.[11] Thus, it is nowadays widely accepted that the h-index offers a meaningful way to identify differences between scholars, departments or journals.[2,11,12]

Beyond its obvious advantages, the h-index has some drawbacks.[12-13] The h-index grows with new publications that can be cited, but it also grows as new citations accrue to already existing articles. Thus, a researcher can increase her h-index without having to publish for a period of time.[2] Another drawback is that using h to measure the quality of a younger scientist does not always yield representative results. This is because a new researcher has not been active for as long, and it takes time for new articles to receive citations. This is especially true in the social sciences and humanities, where an article can take five to ten years to receive a significant number of citations.[14] A further shortcoming of the h-index is that it does not take into account the number of authors who participate in a paper, nor the contribution of each one to it.[1] Thus, a researcher can increase their h-index through collaborations in which their role is not significant, in contrast to contributions where they are the first or second author.

Apparently, such an approach could lead to superficial results, since publication practices as well as the mean impact factor of each field vary significantly.[15] Differences in typical h values across fields are mainly influenced by the average number of citations per paper, the average number of publications produced by each scientist in the field, and the size (number of scientists) of the field.[5] Therefore, it would be better if comparisons were carried out strictly between researchers belonging to the same scientific field, or at least if the results were normalized in order to be comparable. For instance, Batista et al. (2006) report that the ratio between the mean h-indices for the scientific disciplines of biology and mathematics is 3:1. Scientists working in ‘smaller’ or marginal scientific areas will not achieve the same high h values as those who work in extremely topical areas.[1]
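One simple way to carry out the normalization suggested above is to divide a scholar’s h-index by the mean h-index of their field. The sketch below is illustrative only: the field means are assumed values, loosely echoing the 3:1 biology-to-mathematics ratio reported by Batista et al. (2006).

```python
FIELD_MEAN_H = {"biology": 15.0, "mathematics": 5.0}  # assumed, hypothetical means

def normalized_h(h, field):
    """Field-normalized h-index: raw h divided by the field's mean h."""
    return h / FIELD_MEAN_H[field]

# A mathematician with h = 10 compares favorably to a biologist with h = 20
# once the fields' different citation cultures are taken into account.
print(normalized_h(10, "mathematics"))        # 2.0
print(round(normalized_h(20, "biology"), 2))  # 1.33
```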

Study objectives and questions

In general, this paper aims to highlight the positive contribution of scientometrics and its usefulness in issues related to higher education quality evaluation.[16] In specific, it attempts to provide answers to the following research questions:

  1. Are there any significant differences in publications, citations, h index and i10 index between departments of the same discipline which are located in different universities?

  2. Are there any differences on the academics’ scientific performance (as expressed using the h-index) between those who report detailed information about their research on the department’s website and those who don’t?

  3. Are there any differences on the scholars’ performance according to the location in which they completed their PhD (namely Greece, other European countries, or USA)?

  4. Is there any correlation between the academics’ rank (Full Professor, Associate Professor, Assistant Professor, Lecturer) and their h- index and total number of citations?


The rest of the paper is organized as follows. First, the research design is described in detail and the tools used to collect the data are discussed in brief. Subsequently, the results of the analysis are presented for each department and for each research question. Finally, the obtained findings are discussed and future goals are derived.


Research Design

Fifty departments from Science and Engineering disciplines were selected for the study (see Table 1). All in all, 31 Science and 19 Engineering departments were evaluated. The procedure proposed by Altanopoulou, Dontsidou and Tselios[4] was followed to record and analyze the data. The names, surnames and academic rank of all the academics were recorded. The program Publish or Perish (PoP) was used to calculate total publications, citations, h-index, i10-index and m-index. If a scholar maintained a Google Scholar Profile at the time, the related data were collected from there instead of through PoP.

Table 1

Evaluated departments for each scientific field.

Field | Number of evaluated departments | Department’s name (number of evaluated departments per discipline)
Natural and Information sciences | 31 | Mathematics (6), Statistics (2), Physics (5), Biology (5), Chemistry (5), Informatics (8)
Technological sciences | 19 | Civil engineering (5), Chemical engineering (3), Mechanical engineering (5), Electrical and computer engineering (6)

Subsequently, for each department the following indices were calculated: the mean, median and standard deviation of publications, citations, h-index and i10-index; the mean m-index; the percentage of academic members who report information on their website; and the percentage of academic members who maintain a Google Scholar Profile. The aggregate results were then calculated for each department, as well as for the departments of the same discipline. In some cases, name ambiguity could slightly affect the presented data, since there is always the possibility of two scholars having the same name and surname. In such cases, a Google Scholar profile greatly assisted the procedure. Where no profile existed, the data were cleaned and the affiliation of each author was closely examined. However, it is difficult to claim 100% success when evaluating 1978 faculty members.[4]
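The per-department aggregation described above can be sketched with Python’s statistics module. The scholar records below are invented illustration data (h-index, whether activity is reported on the website, whether a GS profile is maintained), not figures from the study.

```python
from statistics import mean, median, stdev

scholars = [(12, True, True), (7, True, False), (21, False, True), (9, True, True)]

h_values = [h for h, _, _ in scholars]
summary = {
    "mean_h": mean(h_values),
    "median_h": median(h_values),
    "sd_h": round(stdev(h_values), 2),
    "pct_reporting": 100 * sum(w for _, w, _ in scholars) / len(scholars),
    "pct_gs_profile": 100 * sum(g for _, _, g in scholars) / len(scholars),
}
print(summary)  # mean_h 12.25, median_h 10.5, 75% reporting, 75% with a profile
```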

As mentioned above, the Google Scholar database was used as the source of the scientometric data. In addition to the free access offered by Google Scholar, three further advantages characterize its use. Google Scholar is easy and straightforward to use. It is also quite efficient, because searches return results immediately, without additional registration steps to access the available data. Finally, the main advantage of Google Scholar is its wide coverage of scientific disciplines and publication venues, which surpasses both Scopus and Web of Science.[17] The information related to the scientific activity of a specific scholar covers not only notable and reputable scientific journals, but also citations from books and book chapters, conference proceedings and technical reports which are not indexed in the Web of Science and Scopus databases.[12,14]

As far as data collection is concerned, each database follows a different approach, which in turn affects the reported totals of publications and citations.[17,18] For this reason, and in conjunction with the margin of error in Google Scholar’s algorithm, the number of citations for a specific scholar may appear smaller (or even higher) than it actually is. This can happen for a variety of reasons, such as an unrecognized text format or an error in recognizing the date of publication.[19] The data for all faculty members and departments were collected from 10 March 2015 to 1 June 2015. The data were recorded and analyzed using Google Sheets and SPSS v.21 and are presented in the following section.

Presentation of the results

In this section, the aggregate evaluation results are presented by subject area. The following data are reported: the number of academics in a department; the median and mean number of publications per faculty member (with standard deviation); the median and mean number of citations per academic (with standard deviation); the median and mean h-index (with standard deviation); and the mean and median i10-index (the number of papers with at least 10 citations each). Moreover, the percentage of academics who report scientific activity on the department’s web site and the percentage of scholars who maintain a Google Scholar Profile are presented. The departments were ranked according to their median h-index.
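The ranking rule stated above, ordering departments by their median h-index, can be sketched as follows; the department names and per-scholar h-indices are hypothetical.

```python
from statistics import median

dept_h = {  # per-scholar h-indices per department (invented data)
    "Dept A": [4, 9, 15, 22],   # median 12
    "Dept B": [6, 11, 13],      # median 11
    "Dept C": [3, 5, 8, 10],    # median 6.5
}
ranking = sorted(dept_h, key=lambda d: median(dept_h[d]), reverse=True)
print(ranking)  # ['Dept A', 'Dept B', 'Dept C']
```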

RQ1. Variation between Departments of the same Scientific Discipline

Departments of Science

In the departments of Mathematics (Table 1), although the department at the University of Crete leads in terms of the academics’ median h-index, the department at the University of Ioannina scores highest in mean (and median) number of publications, median number of citations and mean h-index (Table 1). The department of Athens has, by far, the most members. Among all departments, those at the Universities of Crete and Ioannina have the highest percentage of academics who report their scientific activity and publications on the department’s website.

Table 1

Aggregate results of Departments of Mathematics.

University | No. | Publications | Citations | h-index | i10-index | m-index | Res. Act. | GS pr.

Notes. No.: number of academics serving in each department; Res. Act.: percentage of academics who report scientific activity on the department’s website; GS pr.: percentage of academics maintaining a Google Scholar Profile; Publications: lifetime Google Scholar publications per academic (with standard deviation and median); Citations: citations per academic; Mean h: sum of the academics’ h-indices divided by the number of academics; i10-index: number of papers with at least 10 citations each, per academic. Figures in bold font indicate the highest value in each column.

One may notice significant differences in mean and median number of publications, citations and h-index between the departments. This demonstrates that departments in the same scientific subject, which have the same resources (for example, financial support from the Ministry of Education, comparable infrastructure and exactly the same wage for each academic according to their grades), have notable differences in research outcomes. However, no official national report states those differences. This is also evident in other scientific disciplines, as discussed below.

Table 2 presents the scientometric data for the two Greek departments of Statistics. The department of Athens comes first on all indices except the median number of publications. Both departments have the same percentage of academics who report scientific activity and publications on the department’s website (87.5%). However, the department of Athens shows better results as far as maintenance of Google Scholar profiles is concerned.

Table 2

Aggregate results of departments of Statistics.

University | No. | Publications | Citations | h-index | i10-index | m-index | Res. Act. | GS pr.

Regarding the departments of Physics (Table 3), it appears that the department of Crete has by far the highest scores in most evaluation indices of the scholars’ research work, with a significant lead over the other departments. A notable result is the large difference in mean h-index between the department of Crete and the department at the University of Patras, which is ranked last (25.5 vs. 14.2). The department of Ioannina has the lowest scores in terms of faculty members’ web-site-reported activity and Google Scholar Profile maintenance (13.7%). In the department of Crete, while 100% of the faculty members report their scientific activity on the department’s web site, only 20% of them have a Google Scholar profile.

Table 3

Aggregate results of departments of Physics.

University | No. | Publications (Mean, SD, Median) | Citations (Mean, SD, Median) | h-index (Mean, SD, Median) | i10-index (Mean, SD, Median) | m-index (Mean) | Res. Act. | GS pr.

Among the departments of Chemistry (Table 4), the department at the University of Crete is ranked first on all indices: h-index, publications, citations, i10-index and m-index. The department of Athens has the lowest citations, i10-index and h-index, but 100% of its members maintain a Google Scholar profile. The department at the University of Ioannina has the lowest mean number of publications.

Table 4

Aggregate results of departments of Chemistry.

University | No. | Publications (Mean, SD, Median) | Citations (Mean, SD, Median) | h-index (Mean, SD, Median) | i10-index (Mean, SD, Median) | m-index (Mean) | Res. Act. | GS pr.

In the departments of Biology (Table 5), the department at the University of Crete surpasses all the others in all indices, while in terms of median h-index the remaining departments are comparable. The newly established department at the University of Thrace has relatively low scores in publications, h-index, i10-index, the percentage of scholars who report scientific activity online and the percentage who maintain a Google Scholar profile.

Table 5

Aggregate results of departments of Biology.

University | No. | Publications | Citations | h-index | i10-index | m-index | Res. Act. | GS pr.

Table 6 shows that the Computer Science departments with the highest scores on the evaluation indices are those at the Universities of Athens and Thessaloniki. In terms of mean h-index, third place is held by another department located in Athens, at the Athens University of Economics and Business (AUEB). On the other hand, the departments at the Harokopio and Ionian Universities have the lowest scores on publications, citations, h-index and i10-index. In general, the online reporting rates are quite high for all departments, as is the degree of adoption of a GS profile among the scholars.

Table 6

Aggregate results of Computer Science departments.

University | No. | Publications (Mean, SD, Median) | Citations (Mean, SD, Median) | h-index (Mean, SD, Median) | i10-index (Mean, SD, Median) | m-index (Mean) | Res. Act. | GS pr.

Departments of Engineering

Among the three departments of Chemical Engineering (see Table 7), the department at the University of Patras presents the highest values on most evaluation indices relative to the other two. It is also remarkable that the number of academics at the department of Athens is almost double that of the other two departments, yet the values of all its indices are considerably lower.

Table 7

Aggregate results of Chemical Engineering departments.

University | No. | Publications | Citations | h-index | i10-index | m-index | Res. Act. | GS pr.

Among the Civil Engineering departments (Table 8), the department at the University of Athens is ranked first on all five evaluation indices, while the department at the Democritus University of Thrace scores the lowest. The department at the University of Patras has the highest percentage of members who report their research work on the department’s web site and maintain a GS profile.

Table 8

Aggregate results of Civil Engineering departments.

University | No. | Publications | Citations | h-index | i10-index | m-index | Res. Act. | GS pr.

Table 9 shows that the department at the National Technical University of Athens has the highest values in publications. As far as citations and h-index are concerned, the departments at the NTUA, Crete and Thessaly present quite similar performance. The department of Thrace has the lowest values on almost all indices (except the m-index), as well as in GS profile possession. The department at the University of Patras has the highest rates of GS profile use and of personal webpages reporting academic activity among its members.

Table 9

Aggregate result of Electrical and Computer Engineering departments.

University | No. | Publications | Citations | h-index | i10-index | m-index | Res. Act. | GS pr.

The results in Table 10 show that the department of Mechanical Engineering at the University of Thessaly is first in the ranking, while the department at the University of Western Macedonia is last. However, while the department at the University of Western Macedonia shows low values on almost all indices, it has the highest m-index (indicating a department with relatively young scholars). Moreover, although the department at the University of Thessaloniki is ranked first on various indices such as h-index and i10-index, it has the lowest percentage of members who report scientific activity on the department’s web site. The departments at the Universities of Patras and Thessaly have the highest rates of GS profile use and of personal webpages reporting academic activity among their members.

Table 10

Aggregate results of Mechanical Engineering departments.

University | No. | Publications | Citations | h-index | i10-index | m-index | Res. Act. | GS pr.
W. Maced.1652.53350.5626584.7287.511.56.3915.313.780.881.3%68.8%

In conclusion, the largest differences between departments of the same discipline appear in the Science departments. In some cases, the difference in h-index between the first and the last department exceeds 11 points, and the difference in the mean number of publications exceeds 100. The deviations among the Engineering departments are smaller (up to 6 h-index points and 82 publications, respectively).

RQ2. Differences between academics who report detailed information about their research on the department’s website and those who don’t

Further analysis was conducted to examine a possible link between scientific output and whether scholars report their academic activity. The purpose was to determine whether the level of research differed between those who report their scientific activity and those who do not. Table 11 presents the data collected for the departments of each academic discipline.

Table 11

Differences between academics who report detailed information about their research on the department’s website and those who don’t (* indicates statistical significance at the .005 level).

Computer Science | 0.052 | 0.049* | 0.075
Chemical Engineering | <0.001* | <0.001* | <0.001*
Civil Engineering | <0.001* | <0.001* | <0.001*
Electrical and Computer Engineering | 0.014* | 0.015* | 0.003*
Mechanical Engineering | 0.129 | 0.082 | 0.055

A statistically significant difference emerged in three evaluation indices: publications, citations and h-index (Mann-Whitney U). The results showed that in 50% of the examined departments (25/50) there was a statistically significant difference in publications, h-index and citations between those who reported their scientific activity on the department’s web site and those who did not. In 24% of the departments (Biology, Statistics and Mechanical Engineering), the evaluation indices did not differ between the two groups. In the departments of Chemistry, a statistically significant difference existed only in the publications index, and in some departments only in the citations index. It should be noted that the differences were statistically significant in all indices in the departments of Mathematics, Physics, Civil Engineering, Chemical Engineering, and Electrical and Computer Engineering.
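The Mann-Whitney U comparison used above can be sketched without any external libraries; the normal approximation below (without tie correction) is only an illustration of the test, and both h-index samples are invented. The study itself used standard statistical software (SPSS).

```python
import math

def u_statistic(xs, ys):
    """U = number of (x, y) pairs with x > y, counting ties as 1/2."""
    return sum((x > y) + 0.5 * (x == y) for x in xs for y in ys)

def mann_whitney_p(xs, ys):
    """Two-sided p-value via the normal approximation (no tie correction)."""
    n1, n2 = len(xs), len(ys)
    u = u_statistic(xs, ys)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = abs(u - mu) / sigma
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

reporting = [18, 15, 12, 10]  # h-indices, scholars who report activity (invented)
not_reporting = [6, 5, 3, 2]  # h-indices, scholars who do not (invented)
print(mann_whitney_p(reporting, not_reporting))  # about 0.02, below the .05 level
```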

RQ3. Differences between academics due to the location in which they received their PhD (Greece, Europe, USA)

Differences in the research performance of faculty members according to the source of their PhD were examined. To this end, the scholars were divided into the following three categories: scholars who obtained their PhD in Greece (GR, N=1138), in another European country (EU, N=410), and in the United States of America (USA, N=318). An Internet search was conducted to find the origin of each PhD, with the national PhD theses archive (http://www.didaktorika.gr/eadd) used as an additional source of information. After a thorough investigation, it was possible to collect this information for the majority of the faculty members under consideration, namely 1875. The difficulties at this point had to do with (a) the lack of a detailed CV, (b) the non-inclusion of a doctoral dissertation in the national archive of PhD theses, or (c) the lack of response to personal communication via email. Thus, for 103/1978 academics the region in which they earned their PhD could not be identified, and they were excluded from the study. Another nine scholars completed their doctoral studies in other countries (Russia 2, Israel 1, Japan 1, South Africa 1, Australia 2, Hong Kong 2) and were also excluded from the sample.

Analysis of the data showed that, in general, there were significant differences in all indices depending on the scholars’ doctorate source: h-index (non-parametric Kruskal-Wallis H test, χ2=69.045, p<.001), publications (χ2=56.651, p<.001) and citations (χ2=81.143, p<.001). In addition, further analysis was conducted for each pair (i.e. GR-EU, GR-US, EU-US). The analysis showed that for the pairs GR-US and EU-US there were statistically significant differences in all indices, in favor of the US (Mann-Whitney U; h-index GR-US: U=131088, p<.001; publications GR-US: U=136303.5, p<.001; citations GR-US: U=126538, p<.001; h-index EU-US: U=52446, p<.001; publications EU-US: U=126538, p<.001; citations EU-US: U=50790.5, p<.001). For the pair GR-EU, a statistically significant difference emerged only in the number of publications (in favor of the EU; Mann-Whitney U; h-index GR-EU: U=233948.5, p=.089>.05; publications: U=228655.5, p=.019<.05; citations GR-EU: U=232436.5, p=.059>.05).
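The three-group Kruskal-Wallis comparison reported above can be sketched as follows; the H statistic is computed from pooled midranks (tie correction omitted for brevity). The h-index samples are invented illustration data; the chi-square values quoted in the text come from the study’s own analysis.

```python
def kruskal_wallis_h(*groups):
    """H statistic: rank the pooled values (midranks for ties), compare rank sums."""
    pooled = sorted(v for g in groups for v in g)
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2  # average of positions i+1 .. j
        i = j
    n = len(pooled)
    s = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12 * s / (n * (n + 1)) - 3 * (n + 1)

gr = [5, 7, 8]     # hypothetical h-indices by region of PhD
eu = [6, 9, 12]
us = [11, 14, 16]
print(round(kruskal_wallis_h(gr, eu, us), 3))  # 5.067
```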

Specifically, in 7/10 of the department categories a statistically significant difference in all evaluation indices emerged depending on the region where the PhD was obtained. As mentioned above, further analysis was conducted for each pair (i.e. GR-EU, GR-US, EU-US). Differences were most evident in the GR-US pair, in which statistically significant differences in the indices were detected in 76% (38/50) of the departments, in favor of scholars who earned their PhD in the US. Less pronounced differences were evident in the other pairs. As far as the GR-EU pair is concerned, significant differences in all indices, in favor of the scholars who obtained their PhD in a European country other than Greece, were found in only 12/50 departments; in 40% of the examined departments a statistically significant difference was found in at least one index (publications, citations or h-index). As for the EU-USA pair, a statistically significant difference in at least one index was found in only 20% of the examined departments.

RQ4. Correlation between academic rank, h-index and number of citations

The last research question concerned the link between members’ academic rank and their h-index and number of citations. Table 12 presents the data obtained for the departments of each academic discipline. In general, a higher correlation suggests better hiring and promotion practices, since higher scientific production should be required to achieve higher academic ranks.

Table 12

Correlations between academic rank, h-index and citations by discipline (* indicates statistical significance at the .005 level).

Departments | Academic rank - h-index correlation | Academic rank - citations correlation
Mechanical Engineering | 0.60* | 0.48*
Chemical Engineering | 0.50* | 0.30*
Computer Science | 0.42* | 0.37*
Civil Engineering | 0.39* | 0.27*
Electrical and Computer Engineering | 0.35* | 0.30*

The results showed a statistically significant correlation between members’ academic rank and h-index in 54% (27/50) of the departments. The values of Spearman’s correlation coefficient ranged from .00 (no relationship between rank and h-index, Department of Statistics, University of Piraeus) to .82 (very strong correlation between rank and h-index, Department of Computer Science, University of Thessaly). In addition, in 28% (14/50) of the examined departments a significant correlation between members’ academic rank and citations was found.
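The rank-vs-h-index correlation examined above can be sketched by encoding academic ranks ordinally and computing Spearman’s coefficient as Pearson’s correlation of the rank-transformed data. The faculty records below are invented illustration data.

```python
RANK_ORDER = {"Lecturer": 1, "Assistant": 2, "Associate": 3, "Full": 4}

def midranks(values):
    """Ranks 1..n, averaging positions over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j < len(order) and values[order[j]] == values[order[i]]:
            j += 1
        for k in range(i, j):
            ranks[order[k]] = (i + 1 + j) / 2
        i = j
    return ranks

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation of the midranks of xs and ys."""
    rx, ry = midranks(xs), midranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

faculty = [("Full", 25), ("Associate", 14), ("Assistant", 9), ("Lecturer", 4)]
grades = [RANK_ORDER[r] for r, _ in faculty]
h_vals = [h for _, h in faculty]
print(round(spearman(grades, h_vals), 6))  # 1.0: perfectly monotone rank/h relationship
```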


The purpose of this study was to evaluate faculty members’ research performance in departments of Sciences and Engineering in Greece using scientometric indices. Using the Internet, the Google Scholar citation database and the Publish or Perish (PoP) software,[14] indices such as publications, citations, h-index, i10-index and m-index were collected. The process was quite efficient and accurate. In the majority of the evaluated departments, a significant difference in h-index was observed between academics who report scientific activity on the department’s website and those who do not. Moreover, academics who earned their PhD in the USA have higher indices than scholars who obtained their PhD in Europe or in Greece. Finally, the correlation between academic rank and the scholars’ h-index (or their number of citations) is quite low in some departments, which could indicate a lack of meritocracy.

From the foregoing discussion, it follows that one way to assess the quality of Greek universities is to focus on the research output of their faculty members. The results yield some useful conclusions which could contribute to the improvement of the departments. Moreover, such studies could inform elected officials and policy makers and better shape public opinion. For instance, it would be advisable for potential university students to choose departments based on the reported level of research rather than on criteria such as distance from their place of residence; apparently, in many cases other parameters, such as the socio-economic status of each family, are also involved. Moreover, rankings based on quantitative and widely accepted criteria would help to shape incentives for further research across institutions. Appropriate interventions and policies could also help the universities to reach a satisfactory scientific output.

Implementing bibliometric evaluation in Greek universities on an annual basis would enable them to self-monitor the progress of their scientific output and the degree to which each university meets the tangible objectives laid down by the Greek Ministry of Education. Research is one of faculty members’ basic obligations (apart from teaching and administrative tasks), and one could accordingly classify them as scientifically active or inactive. In this way, with the appropriate reform impetus, an improvement of the current state of the Greek educational system could occur. More specifically, the evaluation indicators used in this work (h-index, m-index, publications, citations, i10-index) could be taken into account, through new legislation, to further incorporate transparent and meritocratic practices into tertiary education. For instance, in some U.S. universities and disciplines, faculty are expected to have an h-index of at least 12 to be promoted to the rank of associate professor, while an h-index of 18 or higher enables promotion to the rank of full professor (Lazaridis, 2008, p. 75). Similar policies in Greece would provide greater research incentives to faculty members to further improve the quality of their research work. In addition, transparent hiring practices based on tangible criteria adopted by hiring committees could motivate young people to pursue careers in academia.

However, the study is not without limitations. Our data rely on websites to determine the academic ranks of scholars; therefore, we may have assigned inaccurate academic ranks to some of the academics in the present study. Also, in some cases name ambiguity occurred, inflating a scholar’s indices. In general, such a problem occurred in only a few cases, since many academics maintain a Google Scholar Profile. The latter is treated as accurate, which in some cases may not be true.

It is quite evident that the research questions answered in this paper cover only a fraction of the possibilities provided by bibliometric evaluation and statistical analysis of faculty research data. Numerous research questions and evaluation indicators of academic performance can be studied thoroughly.[12] Our research mainly focused on indices such as the h-index, publications and citations. In addition, questions related to the region in which scholars obtained their PhD and its possible relation to their output, as well as tendencies such as maintaining a GS profile and reporting scientific activity on the department’s web site, were investigated. Other issues, such as academic inbreeding,[20] the relation between state funding and performance, the number and characteristics of doctoral students and their scientific output, as well as the relation between scholars’ gender, salaries and performance, could be closely monitored and explored. A useful extension of this work is to incorporate representative international departments in order to better monitor scientific progress and examine whether it is calibrated to an international level.

Finally, in order to publicly provide more reliable and representative results, data should be collected from all Greek universities and departments, preferably using a suitable, usable and accessible web application.[21-22] Thus, the data related to scientific output would be instantly available to any stakeholder, without the need for further processing, and could readily inform decisions. Therefore, a more organized, comprehensive and official effort to evaluate all university departments in the country could greatly assist the improvement of Greek tertiary education.



GS: Google Scholar

PoP: Publish or Perish


The authors declare no conflict of interest.



1. Hirsch JE. An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences. 2005;102(46):16569–72.

2. Glänzel W. On the opportunities and limitations of the H-index. Science Focus. 2006.

3. Abramo G, Cicero T, D’Angelo CA. A sensitivity analysis of researchers’ productivity rankings to the time of citation observation. Journal of Informetrics. 2012;6(2):192–201.

4. Altanopoulou P, Dontsidou M, Tselios N. Evaluation of 93 major Greek university departments using Google Scholar. Quality in Higher Education. 2012;18(1):111–37.

5. Egghe L. The Hirsch index and related impact measures. Annual Review of Information Science and Technology. 2010;44(1):65–114.

6. Lazaridis T. Ranking university departments using the mean h-index. Scientometrics. 2010;82(2):211–6.

7. Kazakis NA. Bibliometric evaluation of the research performance of the Greek civil engineering departments in national and European context. Scientometrics. 2014;101(1):505–25.

8. Kazakis NA, Diamantidis AD, Fragidis LL, Lazarides MK. Evaluating the research performance of the Greek medical schools using bibliometrics. Scientometrics. 2014;98(2):1367–84.

9. Shin JC, Toutkoushian RK, Teichler U, editors. University Rankings: Theoretical Basis, Methodology and Impacts on Global Higher Education (Volume 3). Springer Science and Business Media; 2011. p. 3.

10. Glänzel W, Schubert A, Thijs B, Debackere K. A priori vs. a posteriori normalisation of citation indicators. The case of journal ranking. Scientometrics. 2011;87(2):415–24.

11. Cronin B, Meho L. Using the h-index to rank influential information scientists. Journal of the American Society for Information Science and Technology. 2006;57(9):1275–8.

12. Mingers J, Leydesdorff L. A review of theory and practice in scientometrics. European Journal of Operational Research. 2015;246(1):1–19.

13. Moed HF. Citation Analysis in Research Evaluation (Vol. 9). Springer Science and Business Media; 2006. p. 6.

14. Harzing AW. The Publish or Perish Book. Melbourne: Tarma Software Research; 2010.

15. Costas R, Bordons M. The h-index: Advantages, limitations and its relation with other bibliometric indicators at the micro level. Journal of Informetrics. 2007;1(3):193–203.

16. Marchant T, Bouyssou D. Ranking scientists and departments in a consistent manner. Journal of the American Society for Information Science and Technology. 2011;62(9):1761–9.

17. Bar-Ilan J. Which h-index? A comparison of WoS, Scopus and Google Scholar. Scientometrics. 2008;74(2):257–71.

18. Bar-Ilan J. Informetrics at the beginning of the 21st century: A review. Journal of Informetrics. 2008;2(1):1–52.

19. Jacso P. Deflated, inflated, and phantom citation counts. Online Information Review. 2006;30(3):297–309.

20. Inanc O, Tuncer O. The effect of academic inbreeding on scientific effectiveness. Scientometrics. 2011;88(3):885–98.

21. Katsanos C, Tselios N, Tsakoumis A, Avouris N. Learning about web accessibility: A project based tool-mediated approach. Education and Information Technologies. 2012;17(1):79–94.

22. Orfanou K, Tselios N, Katsanos C. Perceived usability evaluation of learning management systems: Empirical evaluation of the System Usability Scale. The International Review of Research in Open and Distance Learning (IRRODL). 2015;16(2):227–46.

23. Kotsiantis S, Tselios N, Xenos M. Students’ evaluation of tutors in distance education: A quasi-longitudinal study. International Journal of Learning Technology. 2016;12(1):26–41.

24. Tselios N, Altanopoulou P. Evaluation of Greek education university departments using Google Scholar’s h-index. In: Proceedings of the 2nd Panhellenic Conference on Integration and Use of ICT in Education; April 2011; Patras, Greece. p. 867–76. In Greek.