Monday, June 28, 2010

Free our data: For democracy's sake

The open data movement is growing apace. What better demonstration of this than news that the UK coalition government is making its Combined Online Information System (COINS) freely available on the Internet, inviting people not only to access the data but to re-use it too?

COINS, The Guardian newspaper points out, is one of the world's biggest government databases and provides "the most detailed record of public spending imaginable. Some 24m individual spending items in a CSV file of 120GB presents a unique picture of how the [UK] government does its business."
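
A 120GB file cannot sensibly be opened in a spreadsheet; to re-use it, the data must be processed as a stream. As a minimal sketch of the kind of re-use the release invites (the column names below are hypothetical; the real COINS schema differs), one might total spending per department like this:

    import csv
    from collections import defaultdict

    def totals_by_department(path):
        """Stream a very large spending CSV row by row, summing the
        amounts per department, without loading the file into memory."""
        totals = defaultdict(float)
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # "department" and "amount" are hypothetical column names.
                try:
                    totals[row["department"]] += float(row["amount"])
                except (KeyError, ValueError):
                    continue  # skip malformed rows rather than aborting
        return totals

    if __name__ == "__main__":
        for dept, total in sorted(totals_by_department("coins.csv").items()):
            print(f"{dept}\t{total:,.2f}")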

For The Guardian the release of COINS marks a high point in a crusade it began in March 2006, when it published an article called "Give us back our crown jewels" and launched the Free Our Data campaign. Much has happened since. "What would have been unbelievable a few years ago is now commonplace," The Guardian boasted when reporting on the release of COINS.

Why did The Guardian start the Free Our Data campaign? Because it wanted to draw attention to the fact that governments and government agencies have been using taxpayers' money to create vast databases containing highly valuable information, and yet have made very little of this information publicly available.

Where the data has been made available, access to it has generally been charged for. Moreover, it has usually been released under restrictive copyright licences prohibiting redistribution, and so preventing third parties from using it to create useful new services.

The end result, The Guardian believes, is that the number and variety of organisations able to make use of the data has been severely curtailed and innovation stifled. As the paper explains, "Making that data available for use for free — rather as commercial companies such as Amazon and Google do with their catalog and maps data — would vastly expand the range of services available."

And it is this argument that has become the rallying cry of the burgeoning open data movement.

But is it true? Is there evidence to demonstrate that by keeping publicly-funded data under wraps governments have been stifling innovation? And does the free availability of government data inevitably lead to a flowering of new information products and services?

The economic argument

In the hope of answering these questions Italian open data advocate Marco Fioretti will shortly be undertaking a research project in conjunction with the Laboratory of Economics and Management (LEM) at the Sant'Anna School of Advanced Studies in Pisa.

The project will form part of an EU-funded work package entitled "Open Data for an Open Society", itself part of a larger initiative called Dynamics of Institutions and Markets in Europe (DIME). DIME is sponsored by the 6th Framework Programme of the European Union.

Explaining the background to Fioretti's project, Dr Giulio Bottazzi, manager of DIME's open data package and Associate Professor of Economics at the Sant'Anna School, says: "We ran an open call for the assignment of part of the activity, specifically the design and realisation of a survey to help assess the present situation concerning the 'openness' of data management in public institutions. Dr Fioretti applied for and obtained the contract — essentially because of his CV and his publication record in the area of open source software and open document standards."

Importantly, Fioretti has a special interest in how better use can be made of public sector information (PSI). And he believes that simply making public data freely available is insufficient. It also needs to be made available using open standards and open licenses. In other words, making digital information freely available is only half the task. Unless it is released in non-proprietary formats, and licensed in a way that allows adaptation and re-use, its usefulness is significantly curtailed.

Fioretti plans to restrict his research to local government data, since he believes it is more likely to be consistent. As he explained recently to O'Reilly Media editor Andy Oram, "the structure of government projects and costs are more similar from one city to another — even across national EU borders — than from one national government to another."

The project will consist of three main phases. First, Fioretti will produce a report discussing the role of fully accessible and reusable digital raw data in an open society. This will be based on examples taken both from the European Union and the rest of the world.

Second, during the summer he will conduct an online survey, hosted on the LEM website. The survey will aim to establish how many EU municipalities and regions are already making their raw data and procedures available by means of open standards and open licenses.

Finally, he will write a further report analysing the results of the survey, and providing some guidelines and best practices for improving full access to digital data.

The grant made available to Fioretti is a modest one — around €12,000 ($14,767), plus an additional €6,000 discretionary fund — but it would seem to be further evidence that the open data movement is gaining mindshare.

In the meantime Fioretti is very keen to hear about real-world examples involving local businesses — both those who have built a successful business (anywhere in the world) by exploiting the free availability of local government data, and those (in Europe) who have struggled to create a viable business as a result of the inaccessibility of this data.

Fioretti can be contacted on mfioretti@nexaima.net.

As noted, Fioretti's research will test the thesis of The Guardian's Free Our Data campaign — i.e. if governments make their data freely available, will it encourage businesses to develop new value-added products? And will this in turn spur innovation and create new jobs? Essentially it is an economic argument.

Transparency

But we should not overlook the fact that there are other reasons for governments to open their databases to citizens. Indeed, some might argue that the economic argument is of secondary importance. A far more compelling reason for "freeing our data", they might add, is that it increases government transparency, and so is an inherently democratic step to take.

To do him justice, while he appears to have prioritised the economic case in his project, Fioretti is not blind to the transparency argument. As he points out on his web site, in addition to providing access to valuable government-created information and encouraging innovation, open data makes it easier for citizens to monitor their government's activities and how their tax dollars are being spent. "Modern software technologies and data networks make it both possible and relatively inexpensive to publish online tenders, regulations, documents, procedures and many other raw public data, from digital maps to pollution measurements," he says. "Making this information really accessible, that is online, in open formats and under open licenses, can both improve transparency in government and foster local economical and cultural activities."

In putting the transparency case to O'Reilly's Andy Oram, Fioretti cited the planned construction of the Strait of Messina Bridge (at an estimated cost of €6.1 billion). When the government announces how much taxpayers' money it plans to spend on a project like this, Fioretti asked, how can the public know that the costs are reasonable if it does not have access to all the data?

Evidently in agreement with Fioretti, Oram expanded on the argument: "[C]ontracts must be very specific about the delivery of data that [a government] commissions — and not just the data, but the formulas and software used to calculate results. For instance, if a spreadsheet was used in calculating the cost of a project, the government should release the spreadsheet data and formulas to the public in an open format so that experts can check the calculations."
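
A check of that kind is only possible in practice if the underlying numbers are released in an open, machine-readable form. As a minimal sketch of what it might look like, assuming the line items were published as a CSV file (the file name, column name and headline figure below are all hypothetical):

    import csv

    def recompute_total(path, column="cost"):
        """Sum a (hypothetical) cost column across all released line items,
        so the published headline figure can be independently checked."""
        total = 0.0
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                total += float(row[column])
        return total

    if __name__ == "__main__":
        # Hypothetical file and headline figure, for illustration only.
        published = 6.1e9  # the announced EUR 6.1bn
        recomputed = recompute_total("bridge_costs.csv")
        print(f"published: {published:,.0f}  recomputed: {recomputed:,.0f}  "
              f"difference: {recomputed - published:,.0f}")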

Certainly it was in the interests of transparency that the UK government released COINS. As the HM Treasury web site puts it, "The coalition agreement made clear that this Government believes in removing the cloak of secrecy from government and throwing open the doors of public bodies, enabling the public to hold politicians and public bodies to account. Nowhere is this truer than in being transparent about the way in which the Government spends your money. The release of COINS data is just the first step in the Government's commitment to data transparency on Government spending."

Coming in the wake of the so-called "MPs' expenses scandal" in Britain, the UK government's decision is far from surprising. What that scandal clearly demonstrated was that Freedom of Information (FoI) legislation alone cannot provide adequate transparency about the way in which governments spend taxpayers' money.

Indeed, it was only after several legal challenges that the UK government conceded the need to be more transparent about MPs' expenses at all. Even then it continued to prevaricate, and it was only after an insider leaked data to the Daily Telegraph that the public became fully apprised of how irresponsibly British politicians had been spending taxpayers' money for their own personal gain — including using public money to fund unnecessary second homes, toilet seats, duck houses and the like.

This suggests that it would be a serious mistake to limit the case for open data to the economic argument alone. Governments also need to open their databases so that the public can know exactly how its money is being spent, and exactly how well the government is managing the country on its behalf.

Participation

But there is a third — and arguably even more important — reason why governments should open their databases: Doing so will help citizens participate more directly in the government of their country, and at an important moment in history.

As the publisher Tim O'Reilly points out, if governments were to exploit the digital network effectively they could engage the public in the political process in ways never previously possible in a modern society. Citizens could, for instance, "crowdsource" social, political and economic problems, take over responsibility for tasks traditionally seen as the prerogative of the state (by means of what O'Reilly calls "citizen self-organisation"), and even play a direct role in political decision-making.

This would be a radical change. The traditional top-down model of government assumes that public participation in the political process is necessarily circumscribed, since in large analogue societies collective decision making does not scale. This has seen citizens consigned to simply casting their votes and then sitting back while politicians govern in their name. One consequence of this has been the emergence of a professional political class that now views itself as standing over and above citizens, who tend to be treated as inferiors, or simply as children, rather than as fellow citizens. Essentially the electorate has been infantilised.

Yet as O'Reilly points out, "government is, at bottom, a mechanism for collective action." And in the age of the Web it is possible to make the political process more horizontal and egalitarian. Rather than viewing government as a hierarchical system in which only elected officials (with the help of professional civil servants) come up with solutions, make decisions, and set the rules, government should now be viewed as a platform to facilitate collective action — much as the Web itself has become a platform across which multiple and diverse applications can run without fear or favour (as advocates of net neutrality like to put it, "all bits are equal").

For this reason, says O'Reilly, the key question for governments today should be, "How do you design a system in which all of the outcomes aren't specified beforehand, but instead evolve through interactions between government and its citizens, as a service provider enabling its user community?"

Clearly this would not be possible unless everyone had access to all the relevant data, much as those building web-based services need to know the protocols of the Web in order to participate in the network economy. This suggests that open data should not be viewed as the endgame, but as a necessary precondition for something far more radical.

The necessary transition, however, is unlikely to occur naturally: just as the Web is only open because its designers created an open model, so creating an open platform for democratic governance would need to be a deliberate decision. And evidence suggests that politicians will resist greater openness and transparency, since it would require them to give up some of their traditional power and authority.

In this regard perhaps open government advocates have been somewhat naïve. When Obama won the US presidential election in 2008, for instance, they assumed the battle had been won. Arguing that Obama's success stemmed from his campaign's adoption of the open source approach to electioneering pioneered by Howard Dean in 2004 (under the guiding hand of Joe Trippi), they anticipated that Obama would introduce the same openness in the White House, and so revolutionise the way in which presidents govern in the US.

In fact, Obama appeared to promise as much. As he put it during the election campaign: "We must use all available technologies and methods to open up the federal government, creating a new level of transparency to change the way business is conducted in Washington, and giving Americans the chance to participate in government deliberations and decision making in ways that were not possible only a few years ago."

With that end in mind, Obama promised to post pending legislation online for comments. And on entering the White House he put his weekly addresses up on YouTube, oversaw the creation of a White House blog, and the launch of several federal web sites — including Data.gov — where the public could access "high value, machine readable datasets generated by the Executive Branch of the Federal Government".
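
"Machine readable" is the operative phrase here: it means the public can fetch a dataset and work on it programmatically rather than reading it page by page. A minimal sketch of what that enables (the URL is a placeholder, not a real Data.gov resource):

    import csv
    import io
    import urllib.request

    def fetch_csv_dataset(url):
        """Download a CSV dataset and parse it into a list of dictionaries,
        one per row, keyed by column name."""
        with urllib.request.urlopen(url) as response:
            text = response.read().decode("utf-8")
        return list(csv.DictReader(io.StringIO(text)))

    if __name__ == "__main__":
        rows = fetch_csv_dataset("https://example.gov/spending.csv")  # placeholder
        if rows:
            print(len(rows), "rows; columns:", ", ".join(rows[0].keys()))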

Clearly open data was viewed as a given in a Government 2.0 environment.

Caught up in the excitement, Newsweek called it Government 2.0: "Instead of a one-way system in which government hands down laws and provides services to citizens, why not use the Internet to let citizens, corporations and civil organisations work together with elected officials to develop solutions?" it asked.

Within weeks of Obama's arrival in the White House, however, the promised transparency began to give way to traditional top-down government, secrecy, and back-room deals.

Earlier this year, for instance, Obama was accused of reneging on a promise to make his health care negotiations freely available on C-SPAN (The Cable-Satellite Public Affairs Network, also freely available on the Internet).

And a year after the launch of Data.gov the site was being derided for offering little more than "thoughtless data dumps".

In reality, concluded cynics, Obama had simply paid lip service to openness and transparency in order to gain power.

However, it is surely more complicated than that. As O'Reilly points out, "we can be misled by the notion of participation to think that it's limited to having government decision-makers 'get input' from citizens … It's a trap for outsiders to think that Government 2.0 is a way to use new technology to amplify the voices of citizens to influence those in power, and by insiders as a way to harness and channel those voices to advance their causes."

In the meantime, the gap between promise and delivery in the Obama administration appears to be widening. Earlier this month, for instance, the New York Times ran an article suggesting that the Obama administration is actually proving more, not less, secretive than previous ones — and cracking down on government leaks in an unprecedented manner. A recent case [in which an intelligence bureaucrat faces 10 felony charges for handing over classified documents to a blogger], the paper suggested, "epitomizes the politically charged debate over secrecy and democracy."

It added, "In 17 months in office, President Obama has already outdone every previous president in pursuing leak prosecutions. His administration has taken actions that might have provoked sharp political criticism for his predecessor, George W. Bush."

Dangerous times

What has gone wrong? In pondering this question earlier this year, the online magazine Slate listed a number of possible reasons — including the White House's use of poor technology, an inability to let go of the traditional top-down approach and, citing an article by Micah Sifry (co-founder of the annual tech/politics conference Personal Democracy Forum), the possibility that much of the buzz surrounding the Obama campaign's web savvy was just that: hype.

The disappointment of open government advocates was palpable at the 2010 Personal Democracy Forum earlier this month. Indeed, some have become sufficiently disenchanted to conclude that, rather than facilitating greater democracy, the Web is acting against it. For this reason, suggested John Perry Barlow, co-founder of the Electronic Frontier Foundation (EFF), those disappointed in Obama should blame the Internet, not the president.

"The political system has partly broken because of the Internet," he argued. "It's made it impossible to govern anything the size of the nation-state". As a consequence, he predicted, "We're going back to the city-state. The nation-state is ungovernably information-rich."

Those not steeped in the culture of American libertarianism will doubtless view Barlow's conclusion as little more than the dystopian fantasy of a cyber-guru who had expected so much more of the Internet. Certainly it runs counter to everything we have learned about the ability of the Web to cope with scale.

So what do we conclude? Is open data really only ever about job creation and innovation? Should we give up on the democratic potential of the Internet? Can we afford to?

Undoubtedly it is possible that governments could release more of their data without making a significant change to the traditional political system. Certainly we cannot rely on them to act in the best interests of their electorate; and so we cannot rely on them to become more transparent, or at least not without a great deal of pressure — much as British politicians had to be forced to change the way in which they managed their expenses system.

But as public cynicism grows, the need for citizens to be brought into the political process in a more meaningful way could become pressing. Today's growing disenchantment with governments and politicians is a dangerous development — and comes at a critical historical moment.

For as governments around the world are forced to cut spending and raise taxes in response to the global financial crisis, citizens could become sufficiently alienated that political unrest begins to destabilise democratic governments.

Already we have seen pressure placed on the political stability of Greece, and signs of incipient civil unrest in a number of other European countries (e.g. Germany, Spain, Portugal and Italy, with the UK doubtless soon to follow in the wake of the new government's emergency budget). And the now routine protests at international forums like the G-20 are a further reminder that more and more citizens are becoming alienated from the political process.

The threat that the financial crisis poses to European democracies was highlighted recently by the President of the European Commission, José Manuel Durão Barroso, who warned that crisis-hit countries in southern Europe could fall victim to military coups or popular uprisings as interest rates soar and public services collapse because their governments run out of money.

In short, these are dangerous times. So we cannot afford to retreat into Barlowesque libertarian fantasies. Rather we should be pointing out that opening up government databases is just the first necessary step for engaging citizens more effectively in the political decision-making process. After all, when people are involved in decision making they are more likely to buy into the proposed solutions — and that surely is the raison d'être for collective decision making.

Moreover, any solution emerging from citizens themselves is more likely to be a robust one. As Trippi points out in his book The Revolution Will Not Be Televised (describing the Dean campaign), some of the best ideas adopted by Dean came not from campaign insiders, but from ordinary citizens. "It became pretty obvious quickly that a couple of dozen sleep-deprived political junkies couldn't possibly match the brainpower and resourcefulness of six hundred thousand Americans," explains Trippi. "We couldn't see every hole and every flaw that they could see."

Likewise, describing Dean's radical decision to post details of the campaign's fund-raising on the Web, Trippi explains that the logic was to, "[I]nvite people in and open up the books. Give them the knowledge and information — how much money we wanted to raise — and they'd take the responsibility for it." And, he adds, they did take responsibility — as people do when you stop infantilising them.

Yes, governments should make their databases freely available to citizens. Yes, new businesses and services will doubtless be created as a result. Yes, that will surely benefit society at large. But open data should be viewed as merely a preliminary to a more far-reaching change in the way democracies are governed. The more important task is to re-engage citizens in the political process, and empower them to take part in collective decision making. That is not possible without open data, but open data alone will not make it happen.

In short, open data is a good cause to advocate, and Fioretti's initiative is both important and timely. But we need to broaden the discussion. It is not a topic that should be viewed exclusively through the lens of economics and entrepreneurship.

As such, the message to governments should be "Free our data: For democracy's sake".

Tuesday, June 08, 2010

Reed Elsevier: Need for a progressive divestiture?

In February Reed Elsevier announced higher than expected pre-tax profits for 2009 of £1,279 million. Commenting on the results, Anthony Habgood, Chairman of Reed Elsevier, said, "We are pleased that our 2009 results were relatively robust given the depth of the global recession. In addition, during the second half we substantially strengthened our balance sheet both through an equity placing and through good cash generation … The late cycle nature of some of our markets makes for a tough environment in 2010 but the fundamentals of our businesses are strong, our balance sheet is in good shape and new management is in place with the background, experience and ambition to drive the business forward."

Most analysts appear to agree with this assessment, and according to the Financial Times web site Reed Elsevier has a current overall rating of A (the highest).

One analyst, however, takes a more gloomy view of things. In two recent equity research reports on Reed Elsevier, Claudio Aspesi — an analyst based at the sell-side research firm Sanford Bernstein — argues that the company is "in denial on the magnitude of the issue potentially affecting scientific publishing", and suggests that it is time to "pursue a progressive break-up of the company". I emailed Aspesi to find out more.

Claudio Aspesi

RP: Can you say something about Bernstein Research, your role in the company, and the purpose of the equity research reports you produce?

CA: Sanford Bernstein is widely recognized as Wall Street's premier sell-side research firm. I am the Senior analyst covering European Media, and my reports are aimed at institutional investment professionals (portfolio managers and analysts) who are interested in investing in European media stocks.

RP: You seem to take a much gloomier view of Reed Elsevier's future than most analysts, and certainly a gloomier one than the company itself. Would you agree? If so, why do you think you are out of sync with other analysts?

CA: I am probably more pessimistic than other analysts and investors. I think that, to some extent, analysts and investors tend to extrapolate the future from their observation of the past. "Noise" about changes to the STM publishing industry has surfaced before, and nothing happened: the publishers continued to raise their prices and the academic and research libraries continued to subscribe.

I am increasingly concerned that we are on the cusp of a moment when many things change, and people need time to adjust. The UK press reported yesterday David Cameron's words on decisions that would "affect our economy, our society — indeed our whole way of life"; many companies and people, on the other hand, are still in denial.

RP: In a report you published in March you said, "Reed Elsevier seems in denial on the magnitude of the issue potentially affecting scientific publishing and we would welcome a more thoughtful approach to this issue". Can you say more about the issue you refer to?

CA: If — and I need to emphasize if — the outcome of the budget constraints on academic libraries is a few years of slow or no revenue growth, the publishers will have, at the very least, to take costs out aggressively.

If budget constraints lead to massive cancellations of "big deal" contracts, followed by offers to sign new contracts at 20-30% lower spending, the publishers will be under severe strain to adapt.

As long as management seems to believe (at least judging from their public statements) that the probability of flat revenues for many years to come is virtually zero, one has to worry about whether there is a Plan B, who is in charge of it and what type of events would trigger it.

Also, the arguments which Elsevier has put forward in the past regarding why OA cannot succeed are unconvincing: for example, when Reed Elsevier argues that OA cannot succeed because of the need for Peer Review, it ignores the fact that most proponents of OA support peer-reviewed dissemination.

RP: Essentially I think you are saying that the company is in denial about Open Access (OA), and the likely impact of OA on its future profitability. Why is OA a threat to Elsevier's future?

CA: OA does not need to be a threat. In fact, I would argue that a shift to Gold OA may be beneficial to Elsevier: if its journals were able to charge submission fees which, in aggregate, are as high as their subscription revenues, they would have the same revenues and probably lower costs.

The real threat comes from self-archiving of Peer Reviewed articles [Green OA]. That is why the SCOAP3 model put forward by the High Energy Physics community is so disruptive: it downgrades the role of the publishers to the management of the Peer Review process and perhaps some editorial service. This requires lower spending and effectively negates the value of the impact factor.
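
For readers unfamiliar with the metric: the standard two-year impact factor is a journal-level average. Roughly, for a journal in year $y$,

$$\mathrm{IF}_y = \frac{\text{citations received in year } y \text{ by items published in years } y-1 \text{ and } y-2}{\text{number of citable items published in years } y-1 \text{ and } y-2}$$

Because every article in a journal inherits this single journal-wide number, a model that disseminates and evaluates articles individually leaves the journal's average with far less to confer.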

Nub of the problem

RP: So what is the nub of the problem Elsevier faces?

CA: That they built a business model predicated on annual revenue increases supported by the steady launch of new titles to justify the increases. It worked well (at least for them) as long as the librarians were able to find the money (regardless of their growing agitation over the increasing spending). It cannot go on if the libraries' budgets continue to be under pressure.

RP: I think you are arguing that the fundamental problem the company faces is that the current global financial crisis means that research libraries will no longer be able to afford to keep paying scholarly publishers like Elsevier the sums they have historically paid them for their journals. In fact, the so-called "serials crisis" is decades old, and yet libraries have always found the money somehow. Why do you think the situation is different now?

CA: I referred earlier to David Cameron's words. The world is changing, and there will be less money to spend, as well as increased scrutiny on how the money is spent.

For example, will the funders of research (governments, foundations and corporations) continue to allocate funding on the basis of the prestige of the publications on which research is published, or will they start rewarding the impact of individual research?

At the APE Conference in Berlin in January 2010 there were several presentations on article-level impact metrics — it is at least plausible to imagine a world in which the value of the franchise of each individual journal decreases and the value of the franchise of the individual articles increases.

RP: If you are correct, how should Reed Elsevier respond?

CA: It is for management to decide. Ultimately, they will need to balance several factors: shareholders' demand for growing revenues and profits, what the competition does, and the different needs of the various constituencies within the academic world.

Even within the academic world, different constituencies have different interests: librarians need to manage budgets and free up resources, researchers need to publish in the most prestigious journals, and administrators need to have established mechanisms to evaluate and reward intellectual contributions.

RP: Is there any way in which Elsevier could leverage current interest in OA and turn it to its advantage?

CA: I doubt that shifting to supporting OA would work for them, since they would probably insist on maintaining or increasing profits to do so. In a challenging funding environment, I am not sure that any OA model that Elsevier could support would meet the funding issues of the academic world.

Progressive break-up?

RP: In a more recent report — published last month — you suggested that it is time to "pursue a progressive break-up of the company". Would that not be an overly drastic strategy to adopt? What is your logic for suggesting it?

CA: A progressive divestiture of the assets of Reed Elsevier would effectively lead to what I call a "creeping IPO (Initial Public Offering)" of Elsevier.

In early 2008, the then CEO Crispin Davis singled out RBI (the business-to-business magazines division) for disposal. Had management completed that divestiture in a timely manner, the Exhibitions division would probably have been next.

Since the divestiture of RBI failed, management has started to sell it in smaller pieces, but I would still expect RBI to be eventually divested entirely; at that point Exhibitions would probably follow. LexisNexis (the legal research business), finally, has significant competitive issues in North America.

On top of all this, there are almost no synergies among all these operating units. A progressive divestiture of all these assets, with the proceeds returned to shareholders, would leave Reed Elsevier operating only Elsevier — at that point the market would find a valuation that reflects the prospects of that division.

RP: Do you think that Elsevier is uniquely challenged by the forces at work in the scholarly publishing market today, or are other large publishers like Springer and Wiley-Blackwell equally threatened?

CA: The same issues affect every publisher, and in fact scale could — in theory — help a larger company fund the investments required to be better equipped for the future.

On the other side, smaller companies are often more willing to innovate because the status quo is not as attractive to them.

Disruptive technologies

RP: Could one perhaps argue that the Internet represents a disruptive technology that could trip up Elsevier? In other words, could Elsevier be confronting a similar situation to that faced by the computer company Digital Equipment Corporation (DEC) in the 1990s? DEC was sideswiped by the arrival of the microcomputer and cheap hard drives, and after failing to respond adequately was eventually broken up and sold off.

CA: There is no question that STM journals are probably the legacy of a cost-effective mechanism to disseminate research when print was the only technology available and that, if we could redesign dissemination from the ground up, we would do so differently.

For example, why should a peer-reviewed article approved for publication have to wait until there is a critical mass of articles to be published two or four times a year? Why should it wait even longer because an issue is already "full"? What does "full" mean in an electronic world?

Many academics I talk to argue that they are well acquainted with the research in their field anyway: increasing specialization means that they all know each other, meet at congresses, and regularly exchange their thoughts.

More and more people in the academic community argue that the journals provide primarily an informal but necessary qualitative ranking of research. Can this be better accomplished in other ways? More and more people think so.

RP: From your comments I take it that you expect the impact factor to be replaced by alternative tools for measuring the quality of research, at which point the very concept of the scholarly journal would become redundant (with scholarly communication presumably moving to some form of Web 2.0 model)? If that happened, publishers would be left with little to do but manage the peer review process; and since one could expect that to have a very significant impact on their revenues, we could presumably anticipate that there might no longer be sufficient incentive for commercial companies to play in the pond?

CA: I am saying that the impact factor could be replaced — I do not have a crystal ball, of course. If the journals were to lose importance and the publishers were left competing for administrative fees to manage peer review, both revenues and margins would decline.

Commercial companies could still try to tap additional sources of revenue, from search tools to value added services. It is not inconceivable, for example, that primary articles could become OA, but the dissemination and interaction of underlying data sets could be a pay function still provided by the commercial publishers.

RP: In what kind of timescale do you expect to see this all playing out?

CA: Much depends on the severity and the length of this funding crisis. The crisis is now, and the risk of librarians cancelling the "big bundles" would only increase with time.

More radical change, like a transition to OA, would probably take many years, even if it happens at all.

RP: Thank you for your time.

Thursday, June 03, 2010

Interview: The Changing Face of Academic Presses

Excerpt from an interview about Northwestern University Press (NUP) with University Librarian Sarah Pritchard, published in the June issue of Information Today:

Q: Does NUP plan to make any of its books OA?

A: I see a lot of advantages to the selective use of OA in both monographs and journals. However, the question you immediately face is how you get over the hump. For a small press, your backlist is your ongoing bread and butter. So you aim to have at least one big seller on your backlist, probably a textbook. NUP has a couple of big-selling textbooks in the field of improvisation and the teaching of drama, for example. These have become staple texts in theater and performance programs.

The problem is that if your backlist is quite profitable and you make it OA, which some people advocate, how do you make up the lost revenue? Or do you just slash your staff?

The truth is that you can't produce books from nothing, even if you are printing them electronically. You still have design, marketing, programming, editorial work, copy editing, and so on. So OA raises a difficult problem for university presses.

Q: The model that many advocate for OA books is to make the text freely available online while selling the print version, the idea being that the etext will drive print sales. Do you see that as a viable model for NUP?

A: Absolutely, I see that as a very logical model, and I would envisage us moving to that model before we move to a totally OA environment. By the way, we are currently in the process of moving one of our journals to OA, which we are very excited about … TriQuarterly.

... more ...