Standard citation analysis metrics (citation counts, h-index, g-index, etc.) are confined to the universes indexed by the Web of Knowledge (WoK), Scopus, and Google Scholar (GS). Academics in urban planning should also consider the web itself (i.e., webometrics, discussed later) for books, book chapters, and journal articles, as well as for the academic gray literature produced and consumed by planning academics and practitioners. This gray literature covers the rest of the academic footprint: research reports, conference presentations, conference proceedings, and funded research grant materials. Course syllabi are an additional source available on the web, and they often cite academic work and other gray literature on planning topics. Other examples of gray literature for planning academics include studio or workshop projects that are posted to the web, often in the form of professional consulting reports.
It is very likely that blog posts and mentions will come to be recognized as gray literature, and potentially accepted as academic products to be evaluated alongside other scholarly artifacts. In their discussion of blogging for untenured professors, Hurt and Yin (2006, 15) describe blogging as a form of “pre-scholarship” whose contents may be the kernels of future articles. Some planning academics already report contributions to sites such as Planetizen.com on their curricula vitae (CVs) under the heading of “other publications.” The legitimacy of these postings is evidenced by the citations, downloads, and forum discussion they generate among a mix of planning academics and professionals. It is likely that many of the roughly 50 percent of non-publishing planning academics that Stiftel, Rukmana, and Alam (2004) identify are producing worthy gray literature that is valuable to planning pedagogy but goes unnoticed by traditional citation analysis and bibliometrics.
References
Hurt, C., and T. Yin. 2006. “Blogging While Untenured and Other Extreme Sports.” Washington University Law Review 84: 1235.
Stiftel, B., D. Rukmana, and B. Alam. 2004. “Faculty Quality at U.S. Graduate Planning Schools: A National Research Council-Style Study.” Journal of Planning Education and Research 24 (1): 6-22.
Next entry: Webometrics
Wednesday, September 11, 2013
Sunday, September 1, 2013
Academic Visibility for Urban Planning
The role of the web in academic research cannot be overstated. As both a source of and a destination for scholarship, cyberspace acts as a marketplace for consumers of research, especially in disciplines with popular appeal and applicability. Urban planning is an excellent example of such a discipline. Because its professional orientation focuses on the well-being of neighborhoods, cities, regions, and nations, our work is shared among academics and has practical implications that are debated and ultimately implemented (or not) by the public (Stiftel and Mogg 2007). A public well informed about community and regional policies and planning activities can be a desirable end in itself, and that audience is international in scale as well (Stiftel and Mukhopadhyay 2007). In addition, justifying the tax dollars spent to support scholarly activity should be of great concern to faculty. Cyberspace is, and increasingly will be, the means by which planning academics promote their contributions to the profession.
A primary activity of academics is discovery through research. Discovery occurs as new thoughts, ideas, or perspectives develop through the research process. These new thoughts, ideas, or perspectives first take shape in the mind but must be expressed in a tangible way to be useful to others. Marchionini (2010) describes this process as converting the mental into the physical in the form of usable information. For social scientists, the physical expression of these “artifacts” commonly takes the form of books, chapters, journal articles, and other types of published reports and documents. More recently, these artifacts appear in electronic form as blog entries, online articles, electronic multimedia, or other web-based products. As Stiftel and Mogg (2007) argued, the electronic realm has revolutionized scholarly communications for planning academics.
In addition to being a source of research information and a means of dissemination, the web also serves as a vehicle for scholarly evaluation. Traditional quantitative measures of academic output have long been used to assess performance, especially for academic promotion and tenure. The message of “publish or perish” within academia stresses the importance of scholarship during the review process. Productivity is a critical factor when arguing for scarce resources, comparing academic programs, and competing in global education and research markets (Goldstein and Maier 2010; Arimoto 2011; Linton, Tierney, and Walsh 2011). Productivity measures are frequently debated and have been used to analyze salary differences between men and women and across disciplines and specialties. Perspectives on productivity are changing rapidly as new modes of electronic research formats and dissemination emerge. The web has created opportunities for extending the reach of academic communications while at the same time presenting challenges for assessing quality and value.
The traditional means of assessing academic productivity and reputation has been citation analysis. Citation analysis for scholarly evaluation has an extensive literature that weighs its appropriateness within and across disciplines and offers nuanced discussion of a range of metrics (see, for example, Garfield 1972; Garfield and Merton 1979; MacRoberts and MacRoberts 1989, 1996; Adam 2002; Moed 2005). Recently, popular metrics like the h-index, g-index, and e-index have been adopted by Google Scholar (GS) to provide web-based citation analysis previously limited to proprietary citation indexes like ISI and Scopus. This is the likely trajectory of citation analysis as open access scholarship becomes more pervasive. There is some debate, however, because GS’s inclusion of gray literature citations means its analyses draw from a different universe of publications when assessing citation frequency and lineage. This article does not dwell on traditional citation analysis techniques because it proposes an expanded approach that moves beyond the bounds of citation indices to assess overall academic visibility and impact on the web. In short, traditional citation analysis has focused on approximately one-third of faculty activity when assessing academic productivity and value (albeit an important one-third), ignoring teaching and outreach/service activities, which are also important expressions of scholarly activity.
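To make these metrics concrete, the short Python sketch below (my own illustration, not code from GS or any citation index) computes the h-index and g-index from a list of per-paper citation counts such as those shown on a scholar's GS profile; the citation counts used here are hypothetical.

def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the g most-cited papers together have at least g*g citations."""
    ranked = sorted(citations, reverse=True)
    running_total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        running_total += cites
        if running_total >= rank * rank:
            g = rank
    return g

counts = [42, 18, 10, 7, 5, 4, 3, 1, 0, 0]   # hypothetical per-paper citation counts
print(h_index(counts))   # 5: five papers have at least 5 citations each
print(g_index(counts))   # 9: the top 9 papers hold 90 citations, which is at least 81

Run against a real publication list, the same few lines reproduce the headline numbers these indexes report, which is part of why they have spread so quickly.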
The web serves as a repository for artifacts of scholarly information and a forum for discussion; it also provides a market for ideas, with a system of feedback about the relevance, reliability, and value of the information posted there. Demand, or value, is expressed through user behavior that generates “reputation,” much as eBay customers score sellers and buyers, Facebook users register “Likes,” social bookmarkers and consumer reviewers comment on and rate products, and page-ranking methods like that of Brin and Page (1998) score web pages. These can function as built-in mechanisms for evaluating many types of academic research. Planning is well suited to this model because demand for its academic products extends beyond the research circles of the discipline to the public, who are frequently involved in urban planning processes.
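As a rough illustration of the page-ranking idea, the Python sketch below (a simplified toy version, not Google's implementation) propagates “reputation” through a small link graph by power iteration: a page's score grows with the scores of the pages that link to it. The page names and links are invented for the example.

# Hypothetical pages and their outgoing links.
links = {
    "dept_page": ["prof_cv", "studio_report"],
    "prof_cv": ["studio_report"],
    "studio_report": ["dept_page"],
    "planning_blog": ["prof_cv", "studio_report"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # Share this page's current rank among the pages it links to.
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += damping * share
            else:
                # Dangling page with no links: spread its rank evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / len(pages)
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")

In this toy graph the studio report ends up with the highest score because every other page links to it, which mirrors how web reputation can accrue to a piece of planning gray literature that many sites reference.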
References
Adam, D. 2002. “Citation analysis: The counting house.” Nature 415 (6873): 726-729.
Arimoto, Akira. 2011. “Reaction to Academic Ranking: Knowledge Production, Faculty Productivity from an International Perspective.” In University Rankings: The Changing Academy – The Changing Academic Profession in International Comparative Perspective, edited by Jung Cheol Shin, Robert K. Toutkoushian, and Ulrich Teichler, 229-258. Dordrecht: Springer Netherlands. http://dx.doi.org/10.1007/978-94-007-1116-7_12.
Brin, S., and L. Page. 1998. “The Anatomy of a Large-Scale Hypertextual Web Search Engine.” Computer Networks and ISDN Systems 30 (1-7): 107-117.
Garfield, E. 1972. “Citation Analysis as a Tool in Journal Evaluation.” Science 178 (4060): 471-479.
Garfield, E., and R. K. Merton. 1979. Citation Indexing: Its Theory and Application in Science, Technology, and Humanities. New York: Wiley.
Goldstein, H., and G. Maier. 2010. “The use and valuation of journals in planning scholarship: Peer assessment versus impact factors.” Journal of Planning Education and Research 30 (1): 66.
Linton, J. D, R. Tierney, and S. T Walsh. 2011. “Publish or Perish: How Are Research and Reputation Related?” Serials Review.
MacRoberts, M. H, and B. R MacRoberts. 1989. “Problems of citation analysis: A critical review.” Journal of the American Society for Information Science 40 (5): 342-349.
———. 1996. “Problems of citation analysis.” Scientometrics 36 (3): 435-444.
Marchionini, G. 2010. “Information Concepts: From Books to Cyberspace Identities.” Synthesis Lectures on Information Concepts, Retrieval, and Services 2 (1): 1-105.
Moed, H. F. 2005. Citation Analysis in Research Evaluation. Dordrecht: Kluwer Academic Publishers.
Stiftel, B., and R. Mogg. 2007. “A planner’s guide to the digital bibliographic revolution.” Journal of the American Planning Association 73 (1): 68-85.
Stiftel, B., and C. Mukhopadhyay. 2007. “Thoughts on Anglo-American hegemony in planning scholarship: Do we read each other’s work?” Town Planning Review 78 (5): 545-572.
Next installment: "Visibility of Planning Gray Literature"