"It''s become one of the hot courses on campus. Enrollment is up, one of the accounting lecturers has twice been named professor of the year, and several dozen students spent their summer mornings in a class poring over a 3-inch-thick tome titled "Federal Taxation." Part of the answer to accounting''s new popularity may be the inherent romance of business. Then there''s this fact: Even in a downish economy, accounting students are finding jobs -- jobs that just might be the first step toward running their own companies. Enrollment in accounting classes is up 19% since 2004, according to a survey by the American Institute of Certified Public Accountants."
Thursday, August 14, 2008
Wednesday, August 13, 2008
Lies, lies and Impact Factors?
The Open Access journal Ethics in Science and Environmental Politics (ESEP) has devoted its entire first issue of 2008 to the question of Impact Factors, with the theme:
The use and misuse of bibliometric indices in evaluating scholarly performance
Here is the editors' foreword:
Bibliometric indices (based mainly upon citation counts), such as the h-index and the journal impact factor, are heavily relied upon in such assessments. There is a growing consensus, and a deep concern, that these indices — more-and-more often used as a replacement for the informed judgement of peers — are misunderstood and are, therefore, often misinterpreted and misused. The articles in this ESEP Theme Section present a range of perspectives on these issues. Alternative approaches, tools and metrics that will hopefully lead to a more balanced role for these instruments are presented.
Here is the TOC:
INTRODUCTION:
Factors and indices are one thing, deciding who is scholarly, why they are scholarly, and the relative value of their scholarship is something else entirely/ Browman HI, Stergiou KI
Escape from the impact factor/ Campbell P
Makes the argument that the most effective and fair analysis of a person’s contribution derives from a direct assessment of individual papers, regardless of where they were published.
Lost in publication: how measurement harms science/ Lawrence PA
Changes to the way scientists are assessed are urgently needed, and suggestions are made in this article.
Hidden dangers of a ‘citation culture’/ Todd PA, Ladle RJ
A look at one of the areas where citation numbers can be inaccurate: the way papers are cited, indexed and searched for.
The siege of science/ Taylor M, Perakakis P, Trachana V
The academic as author, editor and/or reviewer, under intense competitive pressure, is forced to play the publishing game where such numbers rule, leading to frequent abuses of power. Here, Taylor, Perakakis and Trachana review the current status of this siege, how it arose and how it is likely to evolve.
The economics of post-doc publishing/ Cheung WWL
This article explores how bibliometrics affect the publication strategy from a point of view of a post-doctoral fellow, with analogy and explanation from simple economic theories.
Chasing after the high impact/ Tsikliras AC
Young scientists may not always be aware of the advantages and pitfalls of the impact factor system when choosing which journal to submit to. Even so, journal ranking is among their selection criteria: it comes after the journal's general scope and rapid manuscript handling, but ahead of whether a journal allows authors to suggest potential referees or is open access.
Challenges for scientometric indicators: data demining, knowledge flows measurements and diversity issues/ Zitt M, Bassecoulard E
This article describes some of the challenges for bibliometric indicators (data ‘demining’, knowledge-flow measurements and diversity issues) underlying, among other applications, reliable evaluation procedures.
Google Scholar as a new source for citation analysis/ Harzing AWK, van der Wal R
Traditionally, the most commonly used source of bibliometric data is Thomson ISI Web of Knowledge, in particular the Web of Science and the Journal Citation Reports (JCR), which provide the yearly Journal Impact Factors (JIF). This paper presents an alternative source of data as well as 3 alternatives to the JIF to assess journal impact.
Re-interpretation of ‘influence weight’ as a citation-based Index of New Knowledge (INK)/Pauly D, Stergiou KI
The INK method for assessing the ‘influence weight’ of journals is re-interpreted as a potential citation-based indicator of the impact of scientific and other publications.
Benefitting from bibliometry/ Giske J
Within a department, the evaluation of performance may be one of several incentives for improving scientific quality and productivity. However, used alone, performance evaluation can lead to destructive competition and marginalization of potentially valuable staff members.
Using a balanced approach to bibliometrics: quantitative performance measures in the Australian Research Quality Framework/ Butler L Erratum
Citation counts for research evaluation: standards of good practice for analyzing bibliometric data and presenting and interpreting results/ Bornmann L, Mutz R, Neuhaus C, Daniel HD
Here Bornmann et al. present standards of good practice for analyzing bibliometric data and presenting and interpreting the results.
Validating research performance metrics against peer rankings / Harnad S
In and of themselves, metrics are circular: They need to be jointly tested and validated against what it is that they purport to measure and predict, with each metric weighted according to its contribution to their joint predictive power.
Monday, August 11, 2008
Top universities' ranking on the web - UJ not in Top 10
Webometrics has just released its 2008 World Universities' Ranking on the Web, and UJ is listed as the 14th top African university and 3,704th in the world.
Objectives of the Webometrics Ranking of World's Universities
"The original aim of the Ranking was to promote Web publication, not to rank institutions. Supporting Open Access initiatives, electronic access to scientific publications and to other academic material are our primary targets. However web indicators are very useful for ranking purposes too as they are not based on number of visits or page design but global performance and visibility of the universities.As other rankings focused only on a few relevant aspects, specially research results, web indicators based ranking reflects better the whole picture, as many other activities of professors and researchers are showed by their web presence.
The Web covers not only formal (e-journals, repositories) but also informal scholarly communication. Web publication is cheaper, maintaining the high standards of quality of peer review processes. It could also reach much larger potential audiences, offering access to scientific knowledge to researchers and institutions located in developing countries and also to third parties (economic, industrial, political or cultural stakeholders) in their own community."
The top ten consisted of:
(the number in brackets is the university's world ranking)
1 University of Cape Town (385)
2 Stellenbosch University (654)
3 Rhodes University (722)
4 University of Pretoria (734)
5 University of the Witwatersrand (831)
6 University of the Western Cape (1,218)
7 University of KwaZulu-Natal (1,313)
8 University of South Africa (1,499)
9 American University in Cairo (1,654)
10 Nelson Mandela Metropolitan University (2,145)
University news from the West
SIDESTEPPING THE CRANKS Professors are often magnets for crackpots bearing pet theories and searching for validation.
PLEASE DON'T GO Administrators at public universities are devising new strategies to keep key faculty members in an era of increased poaching.
THOSE WHO DO, TEACH Harold and Kumar's Kal Penn at Penn prompts Mark Oppenheimer to ask: Do celebrity profs energize or cheapen a university?
AUTHOR, AUTHOR? Two Case Western Reserve University professors say Routledge recycled their work without credit or royalty.
'SCORES DESCRIBE BUT DON'T EXPLAIN' When used without regard for their complexities, Daniel Koretz says, the results of standardized tests can be misleading.
CHINA: Olympics - low-key involvement by universities Michael Delaney University sports are a big deal in China, followed with great fervour by students and alumni, and many universities boast excellent sporting facilities and stadiums. Yet historically there has been a great distance, even antipathy, between the state administration and university sports departments. As a result, the nation's centralised sports system means universities have largely been left out in the cold when it comes to preparing athletes for the Olympic Games.
GLOBAL: Liberalisation shelved as talks collapse Keith Nuthall Proposals to sweep away some restrictions preventing private universities and higher education service providers from teaching, researching and examining in foreign countries have been put on ice at the World Trade Organization.
GERMANY: Plans to create more leeway for research Michael Gardner Germany's federal government has adopted a five-point plan to create more autonomy for public-funded research institutions. In future, they will enjoy considerably more scope in terms of budgets, staff, networking, construction measures and procurement. The new measures will ultimately lead to a special law on academic freedom agreed to by the government last year.
SPECIAL REPORT: E-Learning
E-learning is one of the buzzwords of 21st-century higher education, with academics around the world increasingly relying on technology to communicate with their students and to deliver their lectures. But as University World News writers report in this special on e-learning, although the technology is a great boon to many who lack easy access to education, it must be used intelligently as a tool for learning and not regarded simply as a panacea.
SOUTH AFRICA: Universities not far behind the curve Karen MacGregor The use of information and communication technologies to support learning in South African universities is booming and they are "not very far behind the curve" of developed countries in e-learning, says Stephen Marquard, learning technology coordinator for the University of Cape Town.
AUSTRALIA: Online studying for the remote and on-the-move Geoff Maslen
UK: Virtual lectures? No thanks, say students Diane Spencer The British government is keen to promote e-learning, as are UK universities. Yet research shows that students still prefer face-to-face learning.
FRANCE: Universities lag 'digitally native' students Jane Marshall French universities must urgently catch up with information and communication technologies if they are to satisfy the higher education demands of the advancing generation of 'digitally native' students.
BRITISH COMMONWEALTH: Big changes in small states Nick Holdsworth The Commonwealth of Learning - the world's only intergovernmental agency solely dedicated to promoting and delivering distance education and open learning - is working with 30 of the British Commonwealth's smaller states to create a 'virtual university'.
Call Your PR Director, Fast When a production company tells your president or dean it wants to profile your college or school in a national television documentary, the offer may not be what it seems.
Game Over The University of Florida is the latest to call for an end to beer pong and other popular college drinking games.