Collaboration: A Primer

Prepared for the ACS Strategic Planning Committee

by Amanda Hagood, Director of Blended Learning, Associated Colleges of the South, and Grace Pang, Program Officer, National Institute for Technology in Liberal Education

Introduction

This primer was developed from a study of sixteen case studies in digitally mediated collaboration and the liberal arts, published by the Associated Colleges of the South (ACS) and the National Institute for Technology in Liberal Education (NITLE) in the summer of 2014. Though the case studies covered topics as diverse as designing and implementing a hybrid course in Asian Studies and launching a program in digital humanities, each provided a fascinating example of how small institutions can marshal their often limited resources and personnel to achieve extraordinary things. The key to each project’s success lies in the strategy of collaboration—though, as we will demonstrate, collaboration exists along a continuum consisting of many different modalities for working together. Drawing on a thoroughgoing analysis of these projects, this primer presents four exemplary projects and asks you to consider how their goals, strategies, and tactics might inform those that should appear in the ACS’s 2020 Vision.

The aims of this primer are threefold:

  • To report why and how faculty and staff within and across ACS institutions are collaborating.
  • To explore how the goals, strategies, and tactics used by these practitioners align with the ACS’s mission to support the liberal arts by creating collaborative opportunities that improve the quality, while reducing the cost, of liberal arts education.
  • To stimulate the Strategic Planning Committee’s thinking about why and how our member institutions could collaborate.

Read more

From Data to Wisdom: Humanities Research and Online Content

by Michael Lesk, Rutgers University

Originally Posted December 16th, 2007

1. Introduction
President Clinton’s 1998 State of the Union Address called for “an America where every child can stretch a hand across a keyboard and reach every book ever written, every painting ever painted, every symphony ever composed.”[1] If that dream is realized, we would have the resources for all humanistic research online. What difference would that make for the humanities?

The widespread availability of online data has already changed scholarship in many fields, and in scientific research the entire paradigm is changing, with experiments now being conducted before rather than after hypotheses are proposed, simply because of the massive amounts of available data. But what is likely to happen in the humanities? So far, much of the work on “cyberinfrastructure” in the humanities has been about accumulating data. This is an essential part of the process, to be sure, but one would like to see more research on the resulting files. At the moment, we have a library that accumulates more books than are read. T. S. Eliot wrote “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?”[2] Modern computer scientists have tended to turn this into a four-stage process, from data to information to knowledge to wisdom. We are mostly still at the stage of having only data.

Read more

Open Access and Institutional Repositories: The Future of Scholarly Communications

by Greg Crane, Winnick Family Chair of Technology and Entrepreneurship, Tufts University

Originally Posted December 16th, 2007

Institutional repositories were the stated topic of a workshop convened in Phoenix, Arizona earlier this year (April 17-19, 2007) by the National Science Foundation (NSF) and the United Kingdom’s Joint Information Systems Committee (JISC). Although their report on the workshop, The Future of Scholarly Communication: Building the Infrastructure for Cyberscholarship, builds out a larger landscape of concern, Bill Arms and Ron Larsen make clear that institutional repositories remain a crucial topic, and one that will never approach its full potential without institutional cyberscholarship.

Read more

Cyberinfrastructure as Cognitive Scaffolding: The Role of Genre Creation in Knowledge Making

by Janet Murray, Georgia Tech

Originally Posted December 16th, 2007 

Professor Janet H. Murray is an internationally recognized interactive designer, the director of Georgia Tech’s Masters Degree Program in Information Design and Technology and Ph.D. in Digital Media, and a member of Georgia Tech’s interdisciplinary GVU Center. She is the author of Hamlet on the Holodeck: The Future of Narrative in Cyberspace (Free Press, 1997; MIT Press, 1998), which has been translated into five languages and is widely used as a roadmap to the coming broadband art, information, and entertainment environments. She is currently working on a textbook for MIT Press, Inventing the Medium: A Principled Approach to Interactive Design, and on a digital edition of the Warner Brothers classic, Casablanca, funded by NEH and in collaboration with the American Film Institute. In addition, she directs an eTV Prototyping Group, which has worked on interactive television applications for PBS, ABC, and other networks. She is also a member of Georgia Tech’s Experimental Game Lab. Murray has played an active role in the development of two new degree programs at Georgia Tech, both of which were launched in Fall 2004: the Ph.D. in Digital Media and the B.S. in Computational Media. In spring 2000 Janet Murray was named a Trustee of the American Film Institute, where she has also served as a mentor in the Enhanced TV Workshop, a program of the AFI Digital Content Lab. She holds a Ph.D. in English from Harvard University, and before coming to Georgia Tech in 1999 she taught humanities and led advanced interactive design projects at MIT. Murray’s primary fields of interest are digital media curricula, interactive narrative, story/games, interactive television, and large-scale multimedia information spaces. Her projects have been funded by IBM, Apple Computer, the Annenberg-CPB Project, the Andrew W. Mellon Foundation, and the National Endowment for the Humanities.


Information infrastructure is a network of cultural artifacts and practices.[1] A database is not merely a technical construct; it represents a set of values and it also shapes what we see and how we see it. Every time we name something and itemize its attributes, we make some things visible and others invisible. We sometimes think of infrastructure, like computer networks, as outside of culture. But pathways, whether made of stone, optical fiber or radio waves, are built because of cultural connections. How they are built reflects the traditions and values as well as the technical skills of their creators. Infrastructure in turn shapes culture. Making some information hard to obtain creates a need for an expert class. Counting or not counting something changes the way it can be used. Increasingly it is the digital infrastructure that shapes our access to information and we are just beginning to understand how the pathways and containers and practices we build in cyberspace shape knowledge itself.

Read more

The Future of Art History: Roundtable

by Jennifer Curran

Originally Posted December 16th, 2007

Introduction: David Green

Principal, Knowledge Culture
Three art historians were invited to think about how their discipline, and their teaching and research within that discipline, might evolve with access to a rich cyberinfrastructure.

Participants were encouraged to think through what might happen to their practice of art history if:
–they had easy access to high-quality, copyright-cleared material in all media;
–they could share research and teaching with whomever they wanted;
–they had unrestricted access to instructional technologists who could assist with technical problems, inspire with teaching ideas and suggest resources they might not otherwise have known about.

What would they do with this freedom and largesse? What new levels of research would be possible (either solo or in collaborative teams)? What new kinds of questions might they be able to answer? How would they most want to distribute the results of their scholarship? Who would the audience be? And would there be a new, dynamic relationship with students in and out of the classroom?

Panelist 1: Guy Hedreen, Professor of Art History, Williams College
On The Next Generation of Digital Images Available to Art Historians

Panelist 2: Dana Leibsohn, Associate Professor of Art, Smith College
On the Technologies of Art History

Panelist 3: Amelia Carr, Associate Professor of Art History, Allegheny College
Overcoming the Practice of Visual Scarcity

Read more

Museums, Cataloging & Content Infrastructure: An Interview with Kenneth Hamma

by David Green, Principal, Knowledge Culture

Originally Published December 16th, 2007

Ken Hamma is a digital pioneer in the global museum community. A classics scholar, Hamma joined the Getty Trust in 1987 as Associate Curator of Antiquities for the Getty Museum. He has since had a number of roles there, including Assistant Director for Collections Information at the Getty Museum, Senior Advisor to the President for Information Policy and his current position, Executive Director for Digital Policy and Initiatives at the Getty Trust.

David Green: Ken, you are in a good position to describe the evolution of digital initiatives at the Getty Trust as you’ve moved through its structure. How have digital initiatives been defined at the Getty and how are they faring at the institutional level as a whole, as the stakes and benefits of full involvement appear to be getting higher? 

Ken Hamma: Being or becoming digital, as shorthand for the thousands of changes institutions like this go through as they adopt new information and communication technologies, has long been discussed at the Getty from the point of view of the technology. And it did once seem that applying technology was merely doing the same things with different tools when, in fact, we were starting to embark upon completely new opportunities. It also once seemed that the technology would be the most expensive part. Now we’ve learned it’s not. It’s content, development and maintenance, staff training, and change management that are the expensive bits.

Read more

College Museums in a Networked Era–Two Propositions

by John Weber, Skidmore College

John Weber is the Dayton Director of the Frances Young Tang Teaching Museum and Art Gallery at Skidmore College, an interdisciplinary museum opened in 2000 to create links between contemporary art and other disciplines as part of the teaching effort at Skidmore. As director of the museum, he supervises the Tang’s staff and oversees exhibitions, programs, collections, and the Tang website, as well as curating and writing for museum publications. Weber is also a member of the Skidmore faculty and teaches in the art history program. Before coming to Skidmore in 2004, he was the curator of education and public programs at the San Francisco Museum of Modern Art from 1993 to 2004, where he spearheaded the design of the Koret Education Center and founded the museum’s interactive educational technologies program. From 1987 to 1993 Weber served as curator of contemporary art at the Portland Art Museum in Oregon.

Originally Published December 16th, 2007

To begin, let’s take it as a given that the “cyberinfrastructure” we are writing about in this edition of Academic Commons is both paradigmatically in place, and yet in some respects technologically immature. The internet and the intertwined web of related technologies that support wired and wireless communication and data storage have already altered our ways of dealing with all manner of textual and audiovisual experience, data, modes of communication, and information searching and retrieval. Higher education is responding, but at a glacial pace, particularly in examining new notions of publishing beyond those which have existed since the printed page. Technologies such as streaming and wireless video remain crude, but digital projectors that handle still image data and video are advancing rapidly, and the gap between still and video cameras continues to close. Soon, I suspect, there will simply be cameras that shoot in whatever mode one chooses (rather than “camcorders” and “digital cameras”), available in a variety of consumer and professional versions and price points. Already, high definition projectors and HD video are a reality, but they have yet to permeate the market. They will soon, with a jump in image quality that will astonish viewers used to current recording and projection quality.

Read more

Cyberinfrastructure and the Sciences at Liberal Arts Colleges

by Francis Starr, Wesleyan University

Professor Starr is a computational and theoretical physicist at Wesleyan University. In the last 10 years, he has published roughly 70 articles focusing on liquids, glasses, gels, polymers, and biologically inspired nanomaterials. Due to the computational demands of his research, Prof. Starr has been involved in developing computing infrastructure since he was a graduate student. He recently joined with several other faculty and the university ITS to provide a university-wide cluster and a companion educational center.

Originally Published December 16th, 2007

Introduction
The technical nature of scientific research led to the establishment of early computing infrastructure, and today the sciences are still pushing the envelope with new developments in cyberinfrastructure. Education in the sciences poses different challenges, as faculty must develop new curricula that incorporate cyberinfrastructure resources and educate students about their use. To be integral to both science research and education, cyberinfrastructure at liberal arts institutions needs to provide a combination of computing and human resources. Computing resources are a necessary first element, but without the organizational infrastructure to support and educate faculty and students alike, computing facilities will have only a limited impact. The complete local cyberinfrastructure picture, even at a small college, is quite large and includes resources like email, library databases, and online information sources, to name just a few. Rather than trying to cover such a broad range, this article will focus on the specific hardware and human resources that are key to a successful cyberinfrastructure in the sciences at liberal arts institutions. I will also touch on how groups of institutions might pool resources, since the demands posed by the complete set of hardware and technical staff may be larger than a single institution alone can manage. I should point out that many of these features are applicable to both large and small universities, but I will emphasize those elements that are of particular relevance to liberal arts institutions. Most of this discussion is based on experiences at Wesleyan University over the past several years, as well as plans for the future of our current facilities.

Read more

The Bates College Imaging Center: A Model for Interdisciplinarity and Collaboration

by Matthew J. Coté, Associate Professor of Chemistry and Director of the Bates College Imaging and Computing Center, Bates College

Originally Published December 16th, 2007

The Bates College Imaging and Computing Center (known on campus simply as the Imaging Center) is a new interdisciplinary facility designed to support Bates’s vision of a liberal arts education, as codified by its newly adopted General Education Program. This program reflects the increasingly porous and mutable nature of disciplinary boundaries and emphasizes the effectiveness of teaching writing as a means of improving students’ ability to think, reason, and communicate. The Imaging Center strives to further expand the reach of this program by promoting visual thinking and communication, serving as a catalyst for interdisciplinary and transdisciplinary work. In many ways the Center embodies the ideas underpinning Bates’s new General Education Program and is a model on this campus for the kind of transformative work cyberinfrastructure will enable.


Floorplan image courtesy of the Bates College Imaging and Computing Center.

Read more

Cyberinfrastructure: Leveraging Change at our Institutions. An interview with James J. O’Donnell

by David Green, Knowledge Culture

Originally Published December 16th, 2007

James O’Donnell, Provost of Georgetown University, is a distinguished classics scholar (most recently author of Augustine: A New Biography) who has contributed immensely to critical thinking about the application of new technologies to the academic realm. In 1990, while teaching at Bryn Mawr College, he co-founded the Bryn Mawr Classical Review, one of the earliest online scholarly journals, and while serving as Professor of Classical Studies at the University of Pennsylvania, he was appointed Penn’s Vice Provost for Information Systems and Computing. In 2000 he chaired a National Academies committee reviewing information technology strategy at the Library of Congress, resulting in the influential report LC21: A Digital Strategy for the Library of Congress. One of his most influential books, Avatars of the Word (Harvard, 1998), compares the impact of the digital revolution to other comparable paradigmatic communications shifts throughout history.

David Green: We’re looking here at the kinds of organizational design and local institutional evolution that will need to happen for liberal arts (and other higher-education) institutions to take advantage of a fully-deployed international cyberinfrastructure. How might access to massive distributed databases and to huge computational and human resources shift the culture, practice and structure of these (often ancient) institutions? How will humanities departments be affected–willingly or unwillingly? Will they lead the way or will they need to be coaxed forward?

James O’Donnell: I think the issue you’re asking about here boils down to the question, “What problem are we really trying to solve?” And I think I see the paradox. The NSF Cyberinfrastructure Report, addressed to the scientific community, could assume a relatively stable community of people whose needs are developing in relatively coherent ways. If wise heads get together and track the development of those needs and their solutions, you can imagine it would then just be an ordinary public policy question: what things do you need, how do you make selections, how do you prioritize, what do you do next? NSF has been in this business for several decades. But when you come to the humanities (and full credit to Dan Atkins, chair of the committee that issued the report, for saying “and let’s not leave the other guys behind”) and you ask “what do these people need?” you come around to the question (that I take it to be the question you are asking of us) “Are we sure these people know they need what they do need?”

Read more