After thirty years as an academic at the University of Southampton, and four years in charge of the UK's e-Science Programme, Tony Hey surprised everyone last year by accepting a post as corporate vice president for technical computing at Microsoft.
Below Hey explains to Richard Poynder why he took the job, and why he believes his decision to do so is good news for the global research community and good news for the Open Access Movement.
Unusual academic
RP: Thank you for making time to speak with me. Let's start with your background. Until joining Microsoft you were an academic who specialised in parallel computing?
TH: Well, I started out as a particle physicist, and I spent 15 years doing particle physics research, and using the whole gamut of UNIX, and tools like LaTeX.
Then in 1985 I switched to computer science and spent 20 years doing computer science in academia.
RP: You were based at the University of Southampton. Can you say more about your research there?
TH: My work was in parallel processing for scientific applications, which when I started was an area that the computer science community had neglected.
RP: You are talking about high performance computing?
TH: Right, or supercomputing: those are the names people use. My research, however, was in practical parallel computing. In the mid 1980s, for instance, I worked on the transputer.
RP: The transputer was a concurrent computing microprocessor developed at INMOS — a company funded by the UK Government, right?
TH: Yes. I worked very closely with INMOS and, with other colleagues, was responsible for developing the transputer, which was in some ways ahead of its time. Today, for instance, we are seeing chips being developed that are very similar to the transputer, but appearing many years later. Had INMOS been properly funded I think the UK would have had a significant impact on the computer industry.
RP: INMOS was part of the so-called white heat of technology initiative instigated by the British Labour Prime Minister Harold Wilson in the 1960s wasn't it?
TH: Indeed. The trouble was that when the Conservatives inherited INMOS, they didn't know what to do with it, and sold it off to Thorn EMI. Thorn in its turn didn't understand that it was necessary to invest in the business. So it was a very, very exciting but brief period when the UK seemed to have the courage of its convictions, and just for a minute the country was really competitive in the field.
After the transputer I went into interoperability and portability of parallel code. One thing I did — with some colleagues — was to write the first draft of the Message Passing Interface [MPI]. This involved a bunch of European and US people meeting every six weeks at a Dallas airport hotel, and within a year we had an implementation of MPI that is now an accepted standard around the world.
I was also very keen, by the way, that there should be an Open Source version developed at the same time. So today there are a number of commercial versions of MPI available, plus an Open Source version. There is even a version running on Microsoft products today. I am proud of that.
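For readers unfamiliar with it, MPI defines a library interface through which cooperating processes exchange messages. The following minimal sketch uses the mpi4py Python bindings purely for illustration; they are not mentioned in the interview, and any MPI implementation (commercial or open source) exposes the same basic operations:

    # Minimal MPI sketch (illustrative only): two processes exchange a message.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD       # communicator spanning all launched processes
    rank = comm.Get_rank()      # this process's identity within the communicator

    if rank == 0:
        comm.send("hello from rank 0", dest=1, tag=11)   # process 0 sends
    elif rank == 1:
        msg = comm.recv(source=0, tag=11)                # process 1 receives
        print("rank 1 received:", msg)

Run with, for example, mpiexec -n 2 python hello_mpi.py (the filename is hypothetical).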
RP: In total you spent thirty years at Southampton University?
TH: Right, although I had 10 years leave of absence.
RP: And you became head of the School of Electronics and Computer Science?
TH: I did, and then I was dean of engineering. So I spanned the gamut from physics, to computer science, to engineering. In that sense I am an unusual academic.
e-Science
RP: But your background was clearly ideal for running the UK's e-Science Programme, which you took over in 2001. What is the UK e-Science Programme?
TH: It was the brainchild of John Taylor when he was running Hewlett Packard's research labs in Europe. He had a vision in which computing would be a utility — a pay-as-you-go service similar in concept to the pay-as-you-go mobile phone services available today.
RP: Or the hosting services offered by companies like Google and Amazon (through its S3 service)?
TH: Exactly. And of course Microsoft now offers such services too — services that are delivered in the cloud: You don't care where they are stored, you just use the services.
Anyway, John was later put in charge of the UK Research Councils, and he found himself working with all the physicists, the chemists, the biologists, and the medics, when they were bidding for money from the government. In fact, it was his responsibility to make those bids.
In doing so, he noticed that a lot of researchers from different institutions were collaborating to do their research, often on an international basis. The particle physics community, for instance, is a genuinely international community, and hundreds of different sites all around the world collaborate with one another.
Other research communities — for example the biologists — might want to collaborate with just a few specific sites: an institute in the UK, say, might want to collaborate with an institute in the US, and an institute in Helsinki. So these three sites would collaborate and share their data.
It was in observing this that John developed his idea of e-Science. Then, when I took over the Programme, it became my task to define it.
RP: So how do you define e-Science?
TH: Well, the first point to make is that it's not a science like biology or chemistry. Rather, it is a set of technologies to enable people to collaborate: to share computing, to share data, and to share the use of remote instruments etc. So e-Science is the technologies that allow networked, distributed, collaborative, multi-disciplinary science. It's a very exciting area.
RP: How does e-Science differ from what in the US is called cyberinfrastructure? Or are we talking about the same thing?
TH: Essentially we are talking about the same thing. In fact I had Paul Messina — who was on the US cyberinfrastructure Blue-Ribbon Advisory Panel — on my steering committee; and John Taylor and I were both interviewed by the Blue-Ribbon Panel. So you will see a lot of e-Science ideas in the US cyberinfrastructure report, and you will see a lot of the US cyberinfrastructure report in what we developed. It just happens that in the US they chose another name. Personally, I think e-Science is a much better name than cyberinfrastructure.
RP: Why?
TH: Because it emphasises science. The purpose isn't to build roads and infrastructure, but to do science. Of course, e-Science depends on the cyberinfrastructure — the networks, the software, and so on, which we in Europe call the e-infrastructure.
But what is wonderful about the e-Science programme is that it has always been application led.
RP: You mean that the emphasis has been on what scientists actually want to do, not the technology?
TH: Exactly. Too often these things are dominated by the technology. And what I really, really liked about the e-Science Programme (and I didn’t set it up that way, John Taylor deserves the credit) is that I was only running about 20% of the budget. That is, I ran the core of the Programme, the part that was needed to underpin all the application projects — and the remaining 80% was application-led.
So it was my responsibility to develop the middleware requirements to support the R&D projects, and the applications themselves were directly funded. This meant that the applications were really great, and that is why the e-Science Programme became so visible around the world.
So I believe we had the right idea. The aim was to do serious science, and to tackle next-generation scientific problems.
RP: Can you give me an example of e-Science in action?
TH: There are many examples. At one end you have particle physics, where physicists need to share their compute clusters to analyse the data that will soon be generated by the LHC machine at CERN in Geneva. At the other end are astronomers who want to share data from different telescopes all over the world ...
####
If you wish to read this interview in its entirety please click on the link below. I am publishing it under a Creative Commons licence, so you are free to copy and distribute it as you wish, so long as you credit me as the author, do not alter or transform the text, and do not use it for any commercial purpose.
If after reading it you feel it is well done you might like to consider making a small contribution to my PayPal account.
I have in mind a figure of $8, but whatever anyone felt inspired to contribute would be fine by me. Payment can be made quite simply by quoting the e-mail account: richard.poynder@btinternet.com. It is not necessary to have a PayPal account to make a payment.
What I would ask is that if you point anyone else to the article then you consider directing them to this post, rather than directly to the PDF file itself.
If you would like to republish the interview on a commercial basis, or have any comments on it, please email me at richard.poynder@btinternet.com.
To read the interview in its entirety (as a PDF file) click here.
Wednesday, November 29, 2006
Creating a National Open Access Policy for Developing Countries
Held in India in the first week of November, the Bangalore workshop on Electronic Publishing and Open Access (OA) was convened in order to agree a model National OA Policy for developing countries. Guest blogger Barbara Kirsop, of the Electronic Publishing Trust for Development, was one of those attending. Below is her report.
Meeting in the idyllic surroundings of the Indian Institute of Science campus, in Bangalore, the 44 participants of the workshop included scientists and OA experts from India, China, Brazil and South Africa, along with colleagues and OA advocates from a number of other countries.
The workshop was hosted by three major Indian scientific institutions — the Indian Institute of Science, the Indian Academy of Science, and the M S Swaminathan Research Foundation — and funded by the Open Society Institute.
The two-day event included thought-provoking presentations on the etymology of such terms as 'open' and 'own', intensive debate about the challenges and opportunities OA raises for developing nations, and a demonstration of the impressive Indo-Chinese Million Books Digital Library project.
With superb Indian refreshments served beneath shady tropical trees, the atmosphere was highly conducive to intensive networking and focussed discussions. The programme also included important updates on existing OA projects, along with statistical evidence showing progress in both OA-advanced countries and developing countries.
Moving on
But why was it felt necessary to hold a workshop on OA so soon after the Salvador Declaration on Open Access for Developing Countries, agreed in September 2005?
The Bangalore workshop was not intended to be a venue simply for confirming acceptance of the principles of OA, but was convened to bring some of the most scientifically advanced developing countries together to report on progress, and consider a model National Open Access Policy that could be offered to governments, and their funding organisations, as a practical tool for driving OA forward.
The aim, therefore, was to take the next step towards ensuring the implementation of earlier OA declarations, not just to talk about OA (the free online availability of peer-reviewed scientific and scholarly journal articles).
What was abundantly clear was that participants agreed on the fundamentals: that academic researchers, in whatever country they work, need access to the published literature in their area of research — for without that they are unable to build on the work of others, gain recognition for their own research, or form professional partnerships.
Grossly uneven
But since the cost of academic journals is prohibitive for many developing countries, scholarly communication is for them severely restricted.
This is a huge problem: A survey conducted by the WHO in 2003, for instance, found that in 75 of the poorest countries, 56% of the medical institutions had been unable to access any journals over the previous five years.
The problem with the current system, therefore, is that the playing field is grossly uneven, and seriously handicaps researchers — who are unable to access all the publications they need to make academic progress because of the high cost of journals.
Furthermore, the cost of printing and distributing local journals means that much developing world research is 'invisible' to the rest of the world, isolating research communities and limiting communication with neighbouring countries.
As a consequence, the incorporation of regional knowledge into international programmes remains minimal. Yet with the growth of global problems — think only of HIV/AIDS, avian 'flu, environmental disasters, climate change or crop failure — it is essential that the countries in which these problems are most commonly experienced have access to research findings, and can contribute their crucial experience to finding global solutions.
Without both improved access and regional visibility, the science base of poorer countries will not be strengthened, and it is well documented that without a strong science base economies remain weak and dependent on others.
But thanks to the profound media developments made possible by the Web, OA has created exhilarating new opportunities for the exchange of essential research information. And while this promises huge benefits for all academic research, it will be especially beneficial for developing nations, by providing equality of access for all.
It is clear, for instance, that wherever researchers have embraced OA, the visibility, quality and impact of local research have flourished, and subscriptions to OA journals have even increased — a clear indication that researchers were previously information-starved.
Progress has been made
Yet while awareness of OA in many developing countries remains low, the more scientifically advanced nations have already recognised the benefits of OA, and are making fast progress — both in converting journals to OA and in establishing interoperable institutional repositories.
Progress has also been made in developing nations, and workshop delegates were updated on local developments. In India, for instance, the MedKnow project in Mumbai has done much to raise the visibility of Indian medical journals in a sustainable way, and without charging authors or readers.
Meanwhile, the Bioline service has recorded impressive increases in requests for full-text papers from the developing country journals it hosts, with a projected one million requests in 2006 for papers that would otherwise be largely unknown and unavailable to local researchers.
This kind of progress highlights the amount of research information that was totally unused pre-OA, due to its inaccessibility.
Successful strategies for filling institutional repositories were also discussed at the workshop, with examples taken not only from the developed regions, but from local research institutes too. One Indian institute, for instance, is 'gently persuading' its scholars to deposit their articles by refusing travel support to those who do not archive their publications!
Further examples were given of OA progress in China and South Africa, as well as from the established SciELO programme in Brazil — all of which confirmed the growth and value of Open Access policies.
Vigorous debate
It was agreed, however, that progress could be significantly speeded up if a model National OA Policy could be drawn up, and developing countries encouraged to adopt it.
It was also felt that this would be particularly effective if it was formally accepted by a group of local experts — of which there was no shortage at the workshop — who know and understand the problems faced by developing countries on the ground.
While there was vigorous debate on how to encourage adoption of such a policy, there was no dissent over the need for it, or of its basic form.
Specifically, the draft Policy document urges governments to require copies of all publicly funded research published in peer-reviewed journals to be deposited in an institutional digital repository as soon as publication is accepted, and encourages government grant holders to provide Open Access to their deposited papers immediately upon deposit. Grant holders are also encouraged to publish in a suitable Open Access journal, where one exists. This should be a condition of research funding for any papers partly or fully funded by the government.
Delegates decided to allow a further two weeks for local consultation about the final wording of the Policy, and submission of further improvements — after which the document would be made widely available.
Next logical step
While the development of any new academic practice always generates wide debate, the Bangalore workshop was grounded on the principle that OA is the next logical step in the evolutionary process of scholarly communication.
The ethos of those attending was summed up by a comment from Lawrence Liang, of the Alternative Law Forum. "Why," he asked, "are top scientists said to be gifted?" "Because," he replied, "they give their research findings to others."
It was also evident to participants that it was important to produce a new tool for ensuring continued progress; a tool targeted clearly, but not exclusively, at developing countries. It was agreed, therefore, that a model National OA Policy would prove to be a valuable advance.
Clearly, the success of the Policy will depend on whether the relevant governments, funders and research institutes adopt its recommendations.
But it may be that the countries represented at the Bangalore workshop will lead the way in the adoption of Open Access policies. After all, they have most to gain and so much to contribute.
All presentations, lists of participants and the draft model National OA Policy document are available on the workshop web site.
Barbara Kirsop can be contacted on: ept@biostrat.demon.co.uk.
Monday, November 20, 2006
Open Access: Beyond Selfish Interests
Few would question that the aim of the Open Access (OA) Movement — to make all research papers freely available on the Web — is a laudable one. OA will considerably benefit the research process, and maximise the use of public funds. It was encouraging therefore to see the topic of OA aired in a number of presentations at the recent Internet Librarian International (ILI). Listening to them, however, I found myself wondering how many acts of selfishness stand between us and OA.
There is today no shortage of discussion about OA. A simple search for the term on Google demonstrates that. Most of these discussions, however, tend to take place amongst the various warring factions of the OA Movement, particularly researchers and librarians, plus the occasional scholarly publisher brave enough to put his head above the parapet.
I was interested, therefore, to hear at ILI the opinions of someone with a less partisan view; the view, moreover, of an economist. For the conference keynote was given by Danny Quah, professor of economics at the prestigious London School of Economics. True, Quah is himself a researcher, but it was clear that it was as an economist that he spoke. He also brought a welcome international perspective to the debate.
Quah had been asked by conference chair Marydee Ojala to give a paper on the weightless economy (a term Quah coined) and "the knowledge glut". In the event, Ojala commented on her blog, he talked about neither of these, but about "the economics of publishing".
However, Ojala may have missed the point. Speaking as an economist, Quah set out to ask a very important question about the so-called knowledge economy. That is, why, despite the glut of information in the world today, is some information increasing in price? He clearly also wanted to draw to the attention of his librarian audience the important role they play in this counterintuitive development.
Maximum social good
Modern economists, explained Quah, work on the assumption that in free markets the greatest social good is achieved when supply and demand are in equilibrium. To help achieve this, he added, brokers, or intermediaries, constantly work to match customers with suppliers. In doing so, intermediaries "perform a useful function — one that increases welfare for society — and they earn the resulting appropriate rewards."
Indeed, he added, the extraordinary thing about free markets is that maximum social good is arrived at through a process in which everyone acts selfishly. "[P]eople do good for society by doing well for themselves", he said, not least the intermediaries, whose sole aim is that of "single-mindedly, just narrowly trying to get suppliers and demanders to meet."
He added: "It is far from the brokers' minds, and rightly so, that in merely doing their job a greater social good might be attained."
Nevertheless, he concluded, the model is not foolproof. When social institutions are created or destroyed, for instance, the opposite effect can occur. And when this happens the system may not automatically be able to repair itself.
That, he argued, is precisely what has happened in the scholarly journal market. Where economists would expect an increase in supply to have caused prices to decrease, the rapid growth in published research we have seen over the past several decades has led to an increase in the price of scholarly journals.
Even more puzzling, he added, this has occurred during a period of unprecedented advances in the technology for distributing information. As he put it, "In the face of arguably the greatest improvement in information dissemination technology in pretty much all of recorded history … academic journal prices have not fallen but actually increased."
And journal prices have increased in an extraordinary manner. By 2000, Quah said, the average annual subscription price of a science and technology journal had reached $1,200, having increased by 80% over the previous decade. In the case of biomedical journals, he added, prices more than doubled in the seven years after 1994.
An important feature of this price inflation, added Quah, is the difference in price between journals produced by commercial publishers, and those produced by non-profit publishers.
So where, for instance, in 2001 the top 10 most-cited economics journals produced by commercial publishers had an average annual subscription rate of $1,370, the cost of the top 10 most-cited economics journals produced by non-profit publishers was just $190.
How could it be, asked Quah, that the market is able to tolerate a 600% mark-up like this?
And lest anyone accuse him of comparing apples and oranges, he added, we should note that the same disparity is evident when costs are calculated on a price-per-page basis: where a page produced by non-profit publishers averaged 18 cents, the figure for commercial publishers was 82 cents.
Likewise, calculated on a per-citation basis, the average non-profit price was 15 cents per citation, compared with $2.40 in the case of commercial publishers.
"Evidently," concluded Quah, "the premium that the marketplace willingly pays commercial publishers remains high — between 5 and 16 times according to these back-of-the-envelope calculations."
Peculiar economic commodity
So why has a glut in scholarly information led not to a fall in price but to an increase? Because, answered Quah, "Information is a peculiar economic commodity. It does not trade easily or conveniently in conventional markets."
As a consequence, he added, "the social institution that has emerged historically to allow information exchange, production, and dissemination is an intellectual property rights (IPR) system."
The problem with the IPR system, however, is that while it is a good way of assigning priority, and according proprietary rights in new information and ideas, it is not an effective pricing mechanism, said Quah.
In short, when the IPR system is used for pricing, rather than assigning priority and ownership, it causes problems — because while ordinary property rights foster market competition, intellectual property rights create and sanction monopolies. As such, rather than facilitating market competition, IPRs stifle it.
The market outcome, said Quah, is one in which price is separated from cost, "and the price mark-up over cost turns out to be whatever the marketplace will bear."
Add to this the fact that the market for scholarly journals is inelastic, he said, and it becomes apparent why there is no market control on the price of scholarly journals.
While Quah did not directly state it, his point was surely that the dysfunction in the market for scholarly journals is a direct result of publishers insisting that, as a condition of publication, researchers have to sign over copyright in their papers, thereby giving publishers an exclusive right to distribute them (at whatever price they want).
Certainly, Quah's analysis touched a raw nerve for some in his audience. Conscious that ILI organiser Information Today is itself also a commercial publisher, the company's president Tom Hogan suggested that Quah's figures did not take into account the fact that learned societies are able to subsidise their publishing activities through membership subscriptions.
"As I said during my talk," Quah replied cannily, "my aim is not to point fingers."
In short, Quah's thesis appeared to be that in a free market diverse selfish actions are aggregated in such a way as to maximise public good — an economic theory first enunciated by Adam Smith in his 1776 book The Wealth of Nations. Smith devised the metaphor of the "Invisible Hand", which economists today frequently use to characterise the way in which in a free market multiple selfish acts can lead to outcomes that have beneficial consequences not just for those individual actors, but for society at large.
In the case of the information market, Quah argued, the IPR system has destroyed the benign social function that the invisible hand can have.
Quah's talk spurred me to think about the various actors in the OA drama, and their different motivations. Might we, I wondered, reach a better understanding of the problems besetting the scholarly journal market if we considered the motivations of the different actors, and the selfish actions that drive the market? Could this also alert us to the dangers ahead, and help us see what needs to be done?
##
If you wish to read this article in its entirety please click on the link below. I am publishing it under a Creative Commons licence, so you are free to copy and distribute it as you wish, so long as you credit me as the author, do not alter or transform the text, and do not use it for any commercial purpose.
Please note that while I am making the article freely available to all, I am a freelance journalist by profession, and so make my living from writing. To assist me to continue making my work available in this way I invite anyone who reads the article to make a voluntary contribution to my PayPal account.
I have in mind a figure of $8, but whatever anyone felt inspired to contribute would be fine by me. Payment can be made quite simply by quoting the e-mail account: richard.poynder@btinternet.com. It is not necessary to have a PayPal account to make a payment.
What I would ask is that if you point anyone else to the article then you consider directing them to this post, rather than directly to the PDF file itself.
If you would like to republish the article on a commercial basis, or have any comments on it, please email me at richard.poynder@journalist.co.uk.
To read the article in its entirety (as a PDF file) click here.
The full text of Danny Quah's paper, and his blog entry on ILI, can be found here.
Sunday, October 15, 2006
Open Access: death knell for peer review?
A frequent criticism of Open Access (OA) is that it will lead to the traditional peer review process being abandoned, with scientific papers simply thrown on to the Web without being subjected to any quality control or independent assessment. Is this likely? If so, would it matter?
The argument that OA threatens peer review is most often made by scientific publishers. They do so, argue OA advocates, not out of any genuine concern, but in the hope that by alarming people they can ward off the growing calls for research funders to introduce mandates requiring that all the research they fund be made freely available on the Internet.
Their real motive, critics add, is simply to protect the substantial profits that they make from scientific publishing.
Whatever the truth, there is no doubt that STM publishers are currently very keen to derail initiatives like the US Federal Research Public Access Act (FRPAA) — legislation that, if passed, would require all US Government agencies with annual extramural research expenditures of over $100 million to make manuscripts of journal articles stemming from research they have funded publicly available on the Internet.
And a primary reason publishers give for their opposition to such initiatives is that they would "adversely impact the existing peer review system."
Factually inaccurate
What do publishers mean by peer review? Wikipedia describes it as the process of "subjecting an author's scholarly work or ideas to the scrutiny of others who are experts in the field. It is used primarily by publishers, to select and to screen submitted manuscripts, and by funding agencies, to decide the awarding of monies for research. The peer review process is aimed at getting authors to meet the standards of their discipline and of science generally."
In other words, when a researcher submits a paper for publication in a scholarly journal the editor will ask a number of other researchers in the field whether the submitted paper warrants publication.
And Open Access is the process of making scholarly papers freely available on the Internet.
There are two ways of achieving OA: researchers can publish their paper in an OA journal like PLoS Biology which, rather than charging readers (or their institutions) a subscription to read the contents of the journal, charges authors (or, more usually, authors' funders) to publish the paper.
Alternatively, researchers can publish in a traditional subscription-based journal like Nature or Science and then make their paper freely available on the Web by self-archiving it in their institutional repository (IR).
Since both methods still require that papers are peer reviewed, OA advocates point out, publishers' claims that making research OA necessitates forgoing the peer review process are factually inaccurate.
And while it is true that some researchers also post their preprints in IRs prior to having them peer reviewed, they add, this is done solely in order to make their research available more quickly, not to avoid peer review. As OA advocate Stevan Harnad frequently points out, OA means "free online access to peer-reviewed research (after — and sometimes before — peer review), not to research free of peer review."
There is, however, a second strand to publishers' claims that OA threatens peer review. If OA is forced on them, they say, they will not be able to survive financially, either because they will discover that there is no stable long-term business model for OA publishing, or because the increasing number of papers researchers post in institutional repositories will cause academic institutions to cancel their journal subscriptions. This poses a threat to peer review, they add, since if publishers exit the market there will be no one left to manage the process.
However, these claims are also rejected by OA advocates, who argue that most publishers have already accommodated themselves to self-archiving. Indeed, they add, there is no indication at all that self-archiving negatively impacts journal subscriptions. Nor is there any reason, they say, to believe that a sustainable OA business model cannot be found.
Far from perfect
But supposing publishers are right, and OA does eventually cause peer review to be abandoned? Would it matter?
After all, while researchers and publishers say they set great store by it, peer review is far from perfect. In September 2000, for instance, the UK Parliamentary Office of Science and Technology (POST) pointed out that many view peer review as "an inherently conservative process … [that] … encourages the emergence of self-serving cliques of reviewers, who are more likely to review each others' grant proposals and publications favourably than those submitted by researchers from outside the group."
Publishers have also been known to concede that there are problems with peer review. Writing in 1997, for instance, the then editor of the British Medical Journal (BMJ), Richard Smith, described peer review as "expensive, slow, prone to bias, open to abuse, possibly anti-innovatory, and unable to detect fraud." He added: "We also know that the published papers that emerge from the process are often grossly deficient."
In fact, it seems that the most that can be said of peer review is that we have failed to come up with anything better. Following the decision by Science to retract the papers it had published by Dr Hwang Woo-suk, for instance — after it was discovered that he had faked his claim to have obtained stem cells from cloned human embryos — publications consultant Liz Wager said of peer review: "it's a lousy system but it's the best one we have."
True, there have been some attempts to improve things. One frequent criticism of peer review, for instance, is that since the scientists who review submitted papers do so anonymously there is little accountability — no doubt assisting "self-serving cliques of reviewers" to arise.
For this reason a few more enlightened publishers have tried to make the process more transparent. In 1999, for instance, the BMJ announced that it would in future "identify to authors the names of those who have reviewed their papers, including the names of our in house editorial and statistical advisers."
And in June 2006, Nature also appeared to have accepted that greater openness would help, announcing an experiment in which some of the papers submitted to the journal would be subjected to the traditional confidential peer review process while also being placed on the Web for open, identifiable public comment.
The aim, Nature said, is to "test the quality of unsolicited comments made during an open peer review process, against the traditional process where no unsolicited comments [are made]".
Too little too late?
Nature's experiment is not a sign that the journal is contemplating abandoning peer review — it is simply exploring what is generally called "open review." We should add that open review is not the same thing as Open Access, since Nature remains a traditional journal based on a subscription publishing model (although it does permit authors to self-archive papers that they have published in the journal six months after publication).
But are traditional journals like Nature in danger of doing too little too late, and being left behind? For in recent months OA journals have also been exploring new ways of evaluating scientific papers. And they are taking a far more radical approach.
In August, for instance, OA publisher Public Library of Science launched a new journal called PLoS ONE. Papers submitted to PLoS ONE still undergo peer review, but a less rigorous form of peer review. Once published, however, the papers are also subjected to post-publication review on the Web.
As the managing editor of PLoS ONE Chris Surridge explained to me when I interviewed him earlier this year, PLoS ONE referees are asked to answer a simpler question than that asked by traditional peer review. That question is: "Has the science in this paper been done well enough to warrant it being entered into the scientific literature as a whole?"
The key point, added Surridge, is that PLoS ONE does not believe that peer review should end on publication. "Every paper will have a discussion thread attached to it. We are also developing ways to allow people to directly annotate the papers themselves."
What PLoS ONE and Nature have in common, of course, is that they are both experimenting with new types of peer review. However, the PLoS ONE model differs from Nature's in a number of significant ways. First, it will utilise a less rigorous form of peer review prior to publication. Second, the open review part of the process will take place not in parallel with the traditional review stage, but after publication. Third, once published the papers will be OA. As such, they will not be locked behind a financial firewall, available only to subscribers.
It is worth making that last point because Nature's ability to experiment with peer review is surely constrained by the limited accessibility of its papers.
For this reason alone, Nature's approach will inevitably have to be more cautious than the one an OA journal like PLoS ONE can adopt, even though Nature clearly acknowledges that profound questions need to be asked about traditional peer review — as evidenced by the online debate it has been holding on the topic.
Root and branch
As a consequence, while traditional publishers must continue to play up the importance of the current peer review system, merely tinkering at the edges, OA journals are able to take a root and branch approach to reform.
On the other hand, however, this leaves the OA Movement vulnerable to allegations from those who oppose initiatives like the FRPAA that it is bent on destroying the scientific process.
Yet the more the failings of traditional peer review are examined, the more radical seem to be the solutions that people feel are necessary. In September, for instance, a group of UK academics keen to improve the way in which scientific research is evaluated launched a new OA journal called Philica.
Unlike both Nature and PLoS ONE, Philica has no editors, and papers are published immediately on submission — without even a cursory review process. Instead, the entire evaluation process takes place after publication, with reviews displayed at the end of each paper.
As such, the aim of the review process is not to decide whether or not to publish a paper, but to provide potential readers with guidance on its importance and quality, thereby enabling particularly popular or unpopular works to be easily identified.
Importantly, argues Philica, its approach means that reviewers cannot suppress ideas if they disagree with them.
Specifically, the evaluation process consists of anonymous, recursively weighted reviews. As the journal's FAQ explains, "When people review Philica entries, they rate them on three criteria: originality, importance and overall quality. These ratings are on a scale of 1-7, where 1 is very poor, 7 is very good and 4, the mid-point, represents a rating of middling quality."
Moreover, unlike PLoS ONE, Philica does not charge author-side fees. This is possible, the FAQ explains, because the overheads are minimal. "Philica can be made free to everybody, whilst retaining the benefits of peer-review, because of the open, online submission and refereeing processes."
Philica is not the only new initiative to push the envelope that bit further. Another approach similar in spirit is that adopted by Naboj, which utilises what it calls a dynamical peer review system.
Modelled on the review system of Amazon, Naboj allows users to evaluate both the articles themselves, and the reviews of those articles. The theory is that with a sufficient number of users and reviewers, a convergence process will occur in which a better quality review system emerges.
Currently Naboj users can only review preprints that have been posted in the popular physics preprint server arXiv.org, but there are plans to extend this to allow reviews of papers in other open archives, including presumably the burgeoning number of institutional repositories.
As with Philica, the aim is not to assess whether papers should be published, but to guide people on the quality of the growing number of scholarly papers being made available online, regardless of whether they have been peer reviewed in the traditional sense.
Do these new services herald the end of peer review? Not necessarily. Philica, for instance, makes a point of stressing that it only accepts reviews submitted by bona fide academics, excluding even graduate students from the process — on the grounds that it "is not normal practice for students to do this to the work of fully-qualified academics, and we do not consider it desirable to change that here."
So while the review process may have migrated from the pre-publication phase to the post-publication phase, it is still peer review.
We should perhaps also stress that Naboj does not claim to be an OA journal, but a web site to help people find useful research. As such it is more of an adjunct to the growing tide of OA literature than an alternative scholarly journal.
Moreover, the vast majority of OA journals — most of those for instance published by the two main OA publishers BioMed Central and Public Library of Science — still practice traditional peer review before publishing papers.
Nevertheless the implications of new initiatives like PLoS ONE and Philica are surely clear.
Complicating an already confused picture
The problem for OA advocates, of course, is that such developments are complicating an already confused picture, and leading to a great deal of misunderstanding about the relationship between OA and peer review.
The consequences of this were amply demonstrated at the beginning of October, when an Associated Press news story about PLoS ONE and Philica was published. As is usual with AP stories, the article was syndicated to multiple newspapers — and with every republication the headlines became increasingly alarmist.
The Desert Sun, for instance, reprinted the article with the headline and subtitle: "Online journals threaten scientific review system: Internet sites publishing studies with little or no scrutiny by peers"; The Gainesville Sun, published it with the headline, "Online publishing a threat to peer review"; and The Monterey Herald went with: "Academic journals bypass peers, go to Web."
"Is this routine editorial carelessness or spreading paranoia?" asked OA advocate Peter Suber exasperatedly on his blog.
The answer, perhaps, is a bit of both. Certainly, the spreading confusion is a boon to publishers bent on killing the various proposals intended to make OA mandatory — since it is causing many to conclude that OA represents a serious threat to the scientific process.
The paranoia reached a peak when the AP article attracted the attention of Harvard's college newspaper, The Harvard Crimson, which responded by publishing a muddle-headed editorial called "Keep Science in Print".
The launch of journals like PLoS ONE, it warned, threatens to "create a morass from which science might not emerge. Results will be duplicated, communication retarded, and progress slowed to a standstill … [as] … scientists will have no way of knowing which discoveries and experiments merit their time and interest. Instead they will spend inordinate amounts of time wading through the quicksand of junk science to get to truly interesting work."
Unfortunately, pointed out Suber, The Harvard Crimson editorial was seriously flawed, failing on at least two important counts. "First, it confuses open review with non-review. Second, it assumes that all online-only journals (open access and subscription-based) use open review — i.e. that traditional peer review requires print."
Inevitable
For those impatient to see OA prevail, the spreading confusion is very frustrating. What OA advocates therefore need to do, suggests Harnad, is insist on keeping discussions about reforming peer review separate from the debate about OA.
So while agreeing that peer review "can be made more effective and efficient", Harnad insists that any discussion about reforming it "should not be mixed up with — or allowed to get in the way of — OA, which is free access to the peer-reviewed literature we have, such as it is."
Conflation, however, seems inevitable — not just because the public is confused, but because, as Suber recently pointed out, "there are a lot of exciting synergies to explore between OA and different models of peer review." A good example of the way in which these are being explored at the Los Alamos National Laboratory, he added, was mentioned by Herbert Van de Sompel during the Nature debate on peer review.
These synergies flow from the fact that when papers are made freely available on the Web so much more is possible. And for this reason, some believe that peer review reform and OA are joined at the hip.
This was a point made by Andrew Odlyzko, a mathematician who heads the University of Minnesota's Digital Technology Centre, in a recent response to Harnad on the American Scientist Open Access Forum. "I think you go too far by denying that Open Access has anything to do with peer review," he said. "For many (including myself), Open Access is (among other things) a facilitator of the evolution towards an improved peer review system."
In this light it is not accidental that OA publishers are beginning to lead the way in peer review reform.
In any case, as the editor-in-chief of Wired magazine Chris Anderson has pointed out, it seems inevitable that the Internet will change the way in which peer review is conducted, not least because where in a print world scholarly papers have to jostle for limited space (pages), in an online environment any such constraints go away.
So where the decision in classical peer review is whether a paper is worthy enough to earn a share of a limited resource, in the online world no such decision is necessary, and the admission gates can be thrown open.
Filter and rank
In an online world, by contrast, the issue is not whether a piece of research gets to go through the gates, but how you filter and rank the expanding tide of research passing through. (But note that this implies that there are no access restrictions imposed on that research).
After all, the Internet will inevitable fill up with junk science regardless of the peer review system. The priority, therefore, will increasingly be to highlight research worthy of attention, and to flag the junk, not whether a paper is published or not.
The important point to bear in mind, says Odlyzko, is that peer review has never been the final edict on a work. "It guarantees neither correctness, nor importance, nor originality. All it does is provide some partially quantifiable assurance of those. The final verdict comes decades (and sometimes centuries) later, when scholars go back and reassess individual contributions."
But perhaps the most interesting question raised by current developments is not when and how science is evaluated, but who does it. As Anderson reminds us, on the Internet a new kind of peer review is emerging — one in which "peer is coming to mean everyman more than professional of equal rank." When people talk about peer-to-peer services, for instance, they are not usually referring to services designed to enable scientists to talk to one another.
Can we therefore assume that the work of a scientist will always be evaluated exclusively by fellow scientists? And would it matter if it were not?
Anderson points out, for instance, that Wikipedia contributors "don't have to have PhDs or any sort of professional affiliation: their contributions are considered on their merit, regardless of who they are or how they have become knowledgeable." Likewise they can delete or edit the contributions made by experts
Tellingly, a controversial study conducted by Nature in 2005 concluded that the accuracy level of science entries in Wikipedia was only slight lower than that fount of all expert wisdom, the Encyclopaedia Britannica.
And those who believe that the intellectual elite should only ever be evaluated by their coevals will surely have been depressed by the implications of the ill-informed and self-congratulatory editorial penned by The Harvard Crimson. Even the intellectual elite, it seems, can talk nonsense at times, and it doesn't take a scientist to spot it.
Given the obvious inaccuracies parroted by the Harvard scribblers, peer review traditionalists will also doubtless feel uncomfortable about the parallel the editorial drew with the evaluation of scholarly literature. "Getting into Harvard is hard, very hard," it boasted. "Yearly the gatekeepers in Byerly Hall vet thousands of applicants on their merits, rejecting many times the number of students that they accept. But getting a scientific paper published in Science or Nature, today’s pre-eminent scientific journals, is oftentimes harder."
As one of those who left a comment on the web site of The Harvard Crimson put it (evidently sceptical about the merit systems at play here), "Since most at Harvard are well-connected and can get their papers published no matter what their merit, perhaps there is anxiety about a purely merit-based system?"
Goodbye gatekeeper?
So has OA sounded a death knell for traditional peer review? Perhaps. Would it matter if it has? Probably not. In fact, OA journals seem to be in the process of developing superior ways of evaluating papers, not doing away with evaluation.
A more accurate way of describing developments, perhaps, is that peer review as understood and practised by traditional journals is giving way to a number of new models. These models are more appropriate to the online world, and perhaps offer a more effective and efficient way of evaluating science. Importantly, they require that scholarly papers are made freely available on the Web — which is precisely the aim of the OA Movement.
What the debate also points to, of course, is that in an Internet-enabled world traditional gatekeepers cannot assume that their role will remain the same.
Indeed, as New York University journalism professor, Jay Rosen pointed out when I interviewed him earlier this year, many traditional gates are in the process of being dismantled. As a consequence, he added, "All kinds of knowledge monopolies — and positions of authority based on them — are wearing away."
In other words, those professionals and organisations that have gained control of society's institutions and power structures are going to have to justify the rights and privileges they currently enjoy, or lose them.
Undoubtedly this includes science publishers who believe that they should retain the exclusive right to determine how scientific papers are published; peer review cliques who think they should be the sole arbiters of what science is and is not published; and, indeed, journalists who believe that, in a world full of blogs, they can maintain a monopoly on deciding what gets reported .
Who knows, perhaps it is only a matter of time before the gatekeepers at Byerly Hall, along with their Ivy League colleagues, discover that their pre-eminent right to adjudicate on academic merit has also disappeared
The argument that OA threatens peer review is most often made by scientific publishers. They do so, argue OA advocates, not out of any genuine concern, but in the hope that by alarming people they can ward off the growing calls for research funders to introduce mandates requiring that all the research they fund is made freely available on the Internet.
Their real motive, critics add, is simply to protect the substantial profits that they make from scientific publishing.
Whatever the truth, there is no doubt that STM publishers are currently very keen to derail initiatives like the US Federal Research Public Access Act (FRPAA) — legislation that, if enacted, would require all US Government agencies with annual extramural research expenditures of over $100 million to make manuscripts of journal articles stemming from research they have funded publicly available on the Internet.
And a primary reason publishers give for opposing such initiatives is that they would "adversely impact the existing peer review system."
Factually inaccurate
What do publishers mean by peer review? Wikipedia describes it as the process of "subjecting an author's scholarly work or ideas to the scrutiny of others who are experts in the field. It is used primarily by publishers, to select and to screen submitted manuscripts, and by funding agencies, to decide the awarding of monies for research. The peer review process is aimed at getting authors to meet the standards of their discipline and of science generally."
In other words, when a researcher submits a paper for publication in a scholarly journal the editor will ask a number of other researchers in the field whether the submitted paper warrants publication.
And Open Access is the process of making scholarly papers freely available on the Internet.
There are two ways of achieving OA. The first is for researchers to publish their paper in an OA journal like PLoS Biology which, rather than charging readers (or their institutions) a subscription to read the journal, charges authors (or, more usually, authors' funders) a fee to publish the paper.
Alternatively, researchers can publish in a traditional subscription-based journal like Nature or Science and then make their paper freely available on the Web by self-archiving it in their institutional repository (IR).
Since both methods still require that papers are peer reviewed, OA advocates point out, the publishers' claim that making research OA necessitates forgoing peer review is factually inaccurate.
And while it is true that some researchers also post their preprints in IRs prior to having them peer reviewed, they add, this is done solely in order to make their research available more quickly, not to avoid peer review. As OA advocate Stevan Harnad frequently points out, OA means "free online access to peer-reviewed research (after — and sometimes before — peer review), not to research free of peer review."
There is, however, a second strand to publishers' claims that OA threatens peer review. If OA is forced on them, they say, they will not be able to survive financially, either because they will discover that there is no stable long-term business model for OA publishing, or because the increasing number of papers researchers post in institutional repositories will cause academic institutions to cancel their journal subscriptions. This poses a threat to peer review, they add, since if publishers exit the market there will be no one left to manage the process.
However, these claims are also rejected by OA advocates, who argue that most publishers have already accommodated themselves to self-archiving. Indeed, they add, there is no indication at all that self-archiving negatively impacts journal subscriptions. Nor is there any reason, they say, to believe that a sustainable OA business model cannot be found.
Far from perfect
But supposing publishers are right, and OA does eventually cause peer review to be abandoned? Would it matter?
After all, while researchers and publishers say they set great store by it, peer review is far from perfect. In September 2000, for instance, the UK Parliamentary Office of Science and Technology (POST) pointed out that many view peer review as "an inherently conservative process … [that] … encourages the emergence of self-serving cliques of reviewers, who are more likely to review each others' grant proposals and publications favourably than those submitted by researchers from outside the group."
Publishers have also been known to concede that there are problems with peer review. Writing in 1997, for instance, the then editor of the British Medical Journal (BMJ), Richard Smith, described peer review as "expensive, slow, prone to bias, open to abuse, possibly anti-innovatory, and unable to detect fraud." He added: "We also know that the published papers that emerge from the process are often grossly deficient."
In fact, it seems that the most that can be said of peer review is that we have failed to come up with anything better. Following the decision by Science to retract papers it had published by Dr Hwang Woo-suk — after it was discovered that he had faked claims to have obtained stem cells from cloned human embryos — publications consultant Liz Wager, for instance, said of peer review: "it's a lousy system but it's the best one we have."
True, there have been some attempts to improve things. One frequent criticism of peer review, for instance, is that since the scientists who review submitted papers do so anonymously there is little accountability — no doubt assisting "self-serving cliques of reviewers" to arise.
For this reason a few more enlightened publishers have tried to make the process more transparent. In 1999, for instance, the BMJ announced that it would in future "identify to authors the names of those who have reviewed their papers, including the names of our in house editorial and statistical advisers."
And in June 2006, Nature also appeared to have accepted that greater openness would help, announcing an experiment in which some of the papers submitted to the journal would be subjected to the traditional confidential peer review process while also being placed on the Web for open, identifiable public comment.
The aim, Nature said, is to "test the quality of unsolicited comments made during an open peer review process, against the traditional process where no unsolicited comments [are made]".
Too little too late?
Nature's experiment is not a sign that the journal is contemplating abandoning peer review — it is simply exploring what is generally called "open review." We should add that open review is not the same thing as Open Access, since Nature remains a traditional journal with a subscription-based publishing model (although it does permit authors to self-archive papers they have published in the journal six months after publication).
But are traditional journals like Nature in danger of doing too little too late, and being left behind? For in recent months OA journals have also been exploring new ways of evaluating scientific papers. And they are taking a far more radical approach.
In August, for instance, OA publisher Public Library of Science launched a new journal called PLoS ONE. Papers submitted to PLoS ONE still undergo peer review, but a less rigorous form of peer review. Once published, however, the papers are also subjected to post-publication review on the Web.
As the managing editor of PLoS ONE, Chris Surridge, explained to me when I interviewed him earlier this year, PLoS ONE referees are asked to answer a simpler question than that asked by traditional peer review. That question is: "Has the science in this paper been done well enough to warrant it being entered into the scientific literature as a whole?"
The key point, added Surridge, is that PLoS ONE does not believe that peer review should end on publication. "Every paper will have a discussion thread attached to it. We are also developing ways to allow people to directly annotate the papers themselves."
What PLoS ONE and Nature have in common, of course, is that they are both experimenting with new types of peer review. However, the PLoS ONE model differs from Nature's in a number of significant ways. First, it will utilise a less rigorous form of peer review prior to publication. Second, the open review part of the process will take place not in parallel with the traditional review stage, but after publication. Third, once published the papers will be OA. As such, they will not be locked behind a financial firewall, available only to subscribers.
It is worth making that last point because Nature's ability to experiment with peer review is surely constrained by the limited accessibility of its papers.
For this reason alone, Nature's approach will inevitably have to be more cautious than that of an OA journal like PLoS ONE, even though the journal clearly acknowledges that profound questions need to be asked about traditional peer review — as evidenced by the online debate it has been holding on the topic.
Root and branch
As a consequence, while traditional publishers have to keep playing up the importance of the current peer review system while tinkering at the edges, OA journals are able to take a root and branch approach to reform.
On the other hand, this leaves the OA Movement vulnerable to allegations from those who oppose initiatives like the FRPAA that it is bent on destroying the scientific process.
Yet the more the failings of traditional peer review are examined, the more radical seem to be the solutions that people feel are necessary. In September, for instance, a group of UK academics keen to improve the way in which scientific research is evaluated launched a new OA journal called Philica.
Unlike both Nature and PLoS ONE, Philica has no editors, and papers are published immediately on submission — without even a cursory review process. Instead, the entire evaluation process takes place after publication, with reviews displayed at the end of each paper.
As such, the aim of the review process is not to decide whether or not to publish a paper, but to provide potential readers with guidance on its importance and quality, enabling particularly popular or unpopular works to be easily identified.
Importantly, argues Philica, its approach means that reviewers cannot suppress ideas if they disagree with them.
Specifically, the evaluation process consists of anonymous, recursively weighted reviews. As the journal's FAQ explains, "When people review Philica entries, they rate them on three criteria: originality, importance and overall quality. These ratings are on a scale of 1-7, where 1 is very poor, 7 is very good and 4, the mid-point, represents a rating of middling quality."
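Philica does not spell out the formula behind those "recursively weighted" ratings, but the general idea can be sketched in a few lines of Python: average each criterion across an entry's reviews, weighting every review by the standing of its anonymous reviewer, which is in turn derived from how that reviewer's own work has been rated. The weighting rule, the reviewer identifiers and the numbers below are invented purely for illustration and are not Philica's actual algorithm.

CRITERIA = ("originality", "importance", "overall_quality")  # each rated 1-7, where 4 is middling

def weighted_ratings(reviews, reviewer_standing):
    """Average each criterion across one entry's reviews, weighting every
    review by the standing of its (anonymous) reviewer."""
    totals = {c: 0.0 for c in CRITERIA}
    weight_sum = 0.0
    for reviewer_id, ratings in reviews:
        w = reviewer_standing.get(reviewer_id, 1.0)  # default weight for reviewers with no history
        weight_sum += w
        for c in CRITERIA:
            totals[c] += w * ratings[c]
    if weight_sum == 0:
        return None
    return {c: round(totals[c] / weight_sum, 2) for c in CRITERIA}

# Hypothetical example: two anonymous reviews of the same entry.
reviews = [
    ("reviewer_a", {"originality": 6, "importance": 5, "overall_quality": 6}),
    ("reviewer_b", {"originality": 3, "importance": 4, "overall_quality": 3}),
]
standing = {"reviewer_a": 2.0, "reviewer_b": 0.5}  # e.g. derived from ratings of each reviewer's own entries
print(weighted_ratings(reviews, standing))
# {'originality': 5.4, 'importance': 4.8, 'overall_quality': 5.4}

The point of the recursion is simply that a reviewer's influence is not conferred by rank or title but earned from how the community has rated that reviewer's own work.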
Moreover, unlike PLoS ONE, Philica does not charge author-side fees. This is possible, the FAQ explains, because the overheads are minimal. "Philica can be made free to everybody, whilst retaining the benefits of peer-review, because of the open, online submission and refereeing processes."
Philica is not the only new initiative to push the envelope that bit further. Another approach similar in spirit is that adopted by Naboj, which utilises what it calls a dynamical peer review system.
Modelled on the review system of Amazon, Naboj allows users to evaluate both the articles themselves, and the reviews of those articles. The theory is that with a sufficient number of users and reviewers, a convergence process will occur in which a better quality review system emerges.
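Naboj's actual scoring mechanism is not described here, but the Amazon-style principle it borrows (readers rate the reviews as well as the articles, and well-regarded reviews count for more) can be sketched roughly as follows. The smoothing formula and all the figures are invented for illustration only.

def article_score(reviews):
    """Combine article ratings (1-5), weighting each review by how helpful
    other readers have found it."""
    num, den = 0.0, 0.0
    for r in reviews:
        votes = r["helpful"] + r["unhelpful"]
        helpfulness = (r["helpful"] + 1) / (votes + 2)  # smoothed so new reviews still count a little
        num += helpfulness * r["rating"]
        den += helpfulness
    return num / den if den else None

reviews = [
    {"rating": 5, "helpful": 40, "unhelpful": 2},   # a review most readers endorsed
    {"rating": 2, "helpful": 1,  "unhelpful": 15},  # a review most readers rejected
]
print(round(article_score(reviews), 2))  # roughly 4.68, dominated by the endorsed review

With only a handful of readers such a score is noisy, but as the number of users and reviewers grows the better-regarded reviews come to dominate, which is the convergence Naboj is counting on.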
Currently Naboj users can only review preprints that have been posted on the popular physics preprint server arXiv.org, but there are plans to extend this to allow reviews of papers in other open archives, including, presumably, the burgeoning number of institutional repositories.
As with Philica, the aim is not to assess whether papers should be published, but to guide people on the quality of the growing number of scholarly papers being made available online, regardless of whether they have been peer reviewed in the traditional sense.
Do these new services herald the end of peer review? Not necessarily. Philica, for instance, makes a point of stressing that it only accepts reviews submitted by bona fide academics, excluding even graduate students from the process — on the grounds that it "is not normal practice for students to do this to the work of fully-qualified academics, and we do not consider it desirable to change that here."
So while the review process may have migrated from the pre-publication phase to the post-publication phase, it is still peer review.
We should perhaps also stress that Naboj does not claim to be an OA journal, but a web site to help people find useful research. As such it is more of an adjunct to the growing tide of OA literature than an alternative scholarly journal.
Moreover, the vast majority of OA journals — including most of those published by the two main OA publishers, BioMed Central and Public Library of Science — still practise traditional peer review before publishing papers.
Nevertheless the implications of new initiatives like PLoS ONE and Philica are surely clear.
Complicating an already confused picture
The problem for OA advocates, of course, is that such developments are complicating an already confused picture, and leading to a great deal of misunderstanding about the relationship between OA and peer review.
The consequences of this were amply demonstrated at the beginning of October, when an Associated Press news story about PLoS ONE and Philica was published. As is usual with AP stories, the article was syndicated to multiple newspapers — and with every republication the headlines became increasingly alarmist.
The Desert Sun, for instance, reprinted the article with the headline and subtitle: "Online journals threaten scientific review system: Internet sites publishing studies with little or no scrutiny by peers"; The Gainesville Sun published it with the headline, "Online publishing a threat to peer review"; and The Monterey Herald went with: "Academic journals bypass peers, go to Web."
"Is this routine editorial carelessness or spreading paranoia?" asked OA advocate Peter Suber exasperatedly on his blog.
The answer, perhaps, is a bit of both. Certainly, the spreading confusion is a boon to publishers bent on killing the various proposals intended to make OA mandatory — since it is causing many to conclude that OA represents a serious threat to the scientific process.
The paranoia reached a peak when the AP article attracted the attention of Harvard's college newspaper, The Harvard Crimson, which responded by publishing a muddle-headed editorial called "Keep Science in Print".
The launch of journals like PLoS ONE, it warned, threatens to "create a morass from which science might not emerge. Results will be duplicated, communication retarded, and progress slowed to a standstill … [as] … scientists will have no way of knowing which discoveries and experiments merit their time and interest. Instead they will spend inordinate amounts of time wading through the quicksand of junk science to get to truly interesting work."
Unfortunately, pointed out Suber, The Harvard Crimson editorial was seriously flawed, failing on at least two important counts. "First, it confuses open review with non-review. Second, it assumes that all online-only journals (open access and subscription-based) use open review — i.e. that traditional peer review requires print."
Inevitable
For those impatient to see OA prevail, the spreading confusion is very frustrating. What OA advocates therefore need to do, suggests Harnad, is insist on keeping discussions about reforming peer review separate from the debate about OA.
So while agreeing that peer review "can be made more effective and efficient", Harnad insists that any discussion about reforming it "should not be mixed up with — or allowed to get in the way of — OA, which is free access to the peer-reviewed literature we have, such as it is."
Conflation, however, seems inevitable — not just because the public is confused, but because, as Suber recently pointed out, "there are a lot of exciting synergies to explore between OA and different models of peer review." A good example of the way in which these synergies are being explored at the Los Alamos National Laboratory, he added, was mentioned by Herbert Van de Sompel during the Nature debate on peer review.
These synergies flow from the fact that when papers are made freely available on the Web so much more is possible. And for this reason, some believe that peer review reform and OA are joined at the hip.
This was a point made by Andrew Odlyzko, a mathematician who heads the University of Minnesota's Digital Technology Centre, in a recent response to Harnad on the American Scientist Open Access Forum. "I think you go too far by denying that Open Access has anything to do with peer review," he said. "For many (including myself), Open Access is (among other things) a facilitator of the evolution towards an improved peer review system."
In this light it is not accidental that OA publishers are beginning to lead the way in peer review reform.
In any case, as Chris Anderson, the editor-in-chief of Wired magazine, has pointed out, it seems inevitable that the Internet will change the way in which peer review is conducted, not least because, where in a print world scholarly papers have to jostle for limited space (pages), in an online environment any such constraints go away.
So where the decision in classical peer review is whether a paper is worthy enough to earn a share of a limited resource, in the online world no such decision is necessary, and the admission gates can be thrown open.
Filter and rank
In an online world, by contrast, the issue is not whether a piece of research gets to go through the gates, but how you filter and rank the expanding tide of research passing through. (But note that this implies that there are no access restrictions imposed on that research).
After all, the Internet will inevitably fill up with junk science regardless of the peer review system. The priority, therefore, will increasingly be to highlight research worthy of attention and to flag the junk, rather than to decide whether or not a paper gets published.
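As a toy illustration of that shift (and not a description of any journal's actual system), filtering and ranking an open pool of papers by post-publication ratings might look something like this; the thresholds and data are invented.

papers = [
    {"title": "Paper A", "avg_rating": 6.3, "num_reviews": 12},
    {"title": "Paper B", "avg_rating": 2.1, "num_reviews": 9},
    {"title": "Paper C", "avg_rating": 5.0, "num_reviews": 2},
]

def rank(papers, junk_threshold=3.0, min_reviews=3):
    """Sort papers by average rating; mark those with too few reviews as not
    yet rated and flag those rated below the junk threshold."""
    for p in papers:
        if p["num_reviews"] < min_reviews:
            p["flag"] = "not yet rated"
        elif p["avg_rating"] < junk_threshold:
            p["flag"] = "flagged as junk"
        else:
            p["flag"] = "recommended"
    return sorted(papers, key=lambda p: p["avg_rating"], reverse=True)

for p in rank(papers):
    print(p["title"], "-", p["flag"])
# Paper A - recommended
# Paper C - not yet rated
# Paper B - flagged as junk

Nothing is refused publication; the reader simply sees the pool ordered and flagged.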
The important point to bear in mind, says Odlyzko, is that peer review has never been the final edict on a work. "It guarantees neither correctness, nor importance, nor originality. All it does is provide some partially quantifiable assurance of those. The final verdict comes decades (and sometimes centuries) later, when scholars go back and reassess individual contributions."
But perhaps the most interesting question raised by current developments is not when and how science is evaluated, but who does it. As Anderson reminds us, on the Internet a new kind of peer review is emerging — one in which "peer is coming to mean everyman more than professional of equal rank." When people talk about peer-to-peer services, for instance, they are not usually referring to services designed to enable scientists to talk to one another.
Can we therefore assume that the work of a scientist will always be evaluated exclusively by fellow scientists? And would it matter if it were not?
Anderson points out, for instance, that Wikipedia contributors "don't have to have PhDs or any sort of professional affiliation: their contributions are considered on their merit, regardless of who they are or how they have become knowledgeable." Likewise, they can delete or edit the contributions made by experts.
Tellingly, a controversial study conducted by Nature in 2005 concluded that the accuracy of the science entries in Wikipedia was only slightly lower than the accuracy of that fount of all expert wisdom, the Encyclopaedia Britannica.
And those who believe that the intellectual elite should only ever be evaluated by their coevals will surely have been depressed by the implications of the ill-informed and self-congratulatory editorial penned by The Harvard Crimson. Even the intellectual elite, it seems, can talk nonsense at times, and it doesn't take a scientist to spot it.
Given the obvious inaccuracies parroted by the Harvard scribblers, peer review traditionalists will also doubtless feel uncomfortable about the parallel the editorial drew with the evaluation of scholarly literature. "Getting into Harvard is hard, very hard," it boasted. "Yearly the gatekeepers in Byerly Hall vet thousands of applicants on their merits, rejecting many times the number of students that they accept. But getting a scientific paper published in Science or Nature, today’s pre-eminent scientific journals, is oftentimes harder."
As one of those who left a comment on the web site of The Harvard Crimson put it (evidently sceptical about the merit systems at play here), "Since most at Harvard are well-connected and can get their papers published no matter what their merit, perhaps there is anxiety about a purely merit-based system?"
Goodbye gatekeeper?
So has OA sounded a death knell for traditional peer review? Perhaps. Would it matter if it has? Probably not. In fact, OA journals seem to be in the process of developing superior ways of evaluating papers, not doing away with evaluation.
A more accurate way of describing developments, perhaps, is that peer review as understood and practised by traditional journals is giving way to a number of new models. These models are more appropriate to the online world, and perhaps offer a more effective and efficient way of evaluating science. Importantly, they require that scholarly papers are made freely available on the Web — which is precisely the aim of the OA Movement.
What the debate also points to, of course, is that in an Internet-enabled world traditional gatekeepers cannot assume that their role will remain the same.
Indeed, as New York University journalism professor Jay Rosen pointed out when I interviewed him earlier this year, many traditional gates are in the process of being dismantled. As a consequence, he added, "All kinds of knowledge monopolies — and positions of authority based on them — are wearing away."
In other words, those professionals and organisations that have gained control of society's institutions and power structures are going to have to justify the rights and privileges they currently enjoy, or lose them.
Undoubtedly this includes science publishers who believe that they should retain the exclusive right to determine how scientific papers are published; peer review cliques who think they should be the sole arbiters of which science is and is not published; and, indeed, journalists who believe that, in a world full of blogs, they can maintain a monopoly on deciding what gets reported.
Who knows, perhaps it is only a matter of time before the gatekeepers at Byerly Hall, along with their Ivy League colleagues, discover that their pre-eminent right to adjudicate on academic merit has also disappeared.
Wednesday, September 27, 2006
Shining a light
Yesterday's announcement by IBM that it is establishing a new patent policy to promote innovation is further evidence — were such evidence needed — of how broken the patent system has become.
The four tenets of the new policy are that patent applicants should be responsible for the quality and clarity of their patent applications; that patent applications should be available for public examination; that patent ownership should be transparent and easily discernible; and that pure business methods without technical merit should not be patentable.
While IBM indicated ways in which it will be introducing the new policy within the company itself, it also hopes that by publishing the policy it will provide a model that other companies will follow.
Commenting on the announcement Dr John Kelly III, IBM senior vice president for Technology and Intellectual Property, said: "IBM is holding itself to a higher standard than any law requires because it's urgent that patent quality is improved, to both stimulate innovation and provide greater clarity for the protection and enforcement of intellectual property rights."
In other words, the patent system is now so cracked that even large multinational companies like IBM are starting to conclude that the problems it introduces may end up outweighing the benefits it provides.
As The New York Times pointed out, "Rapid advances in technology and the rise of industries like software, biotechnology and nanotechnology have resulted in a steep increase in patent applications in recent years. With limited resources, the United States Patent and Trademark Office has been overwhelmed."
And it is the same story in all the major patent offices.
The paper added, "The avalanche of patents — many making broad and vague claims — has produced an environment of uncertainty, rich in opportunity for litigation and patent speculators."
On the face of it, the most noteworthy aspect of IBM's new policy would seem to be that it has committed itself to opening up its patent applications to community review. Since this would provide competitors with intelligence about its R&D activities, there would be some risk attached to doing so. In reality, however, it appears that the company is only committing to open up some of its patent applications for review.
It is also noteworthy that IBM is proposing that only business methods that have obvious "technical merit" should be patented. Where inventions lack such merit, it suggests, companies would do better to place details of them in the public domain — a practice widely known as "defensive publishing". As the IBM press release puts it, "Applicants should seek to publish, not patent, their pure business method innovations if they wish to prevent others from patenting similar business methods."
The company also announced that it will make available to the public over 100 of its business method patents, and that its technical experts "will spend thousands of hours annually reviewing published patent applications submitted to patent offices."
The announcement is not the first time that IBM has taken a lead in introducing greater openness to the development of technology. In June 1998, for instance, the company took the revolutionary decision to sell and support the Open Source web server Apache as part of its WebSphere suite — a move widely viewed as having given the Open Source Movement a significant boost at a critical moment.
And last year the company gave a pledge [PDF] not to assert 500 of its patents against Open Source software developers.
However, it would be wrong to assume that IBM has any intention of abandoning its traditional proprietary approach to business. The company remains the largest US patent holder and, according to The New York Times, was granted 2,974 patents in the US last year alone. Moreover, as the paper pointed out, IBM also remains committed to the patenting of software, a practice widely condemned by Open Source software developers, and one that many hold to pose a serious threat to future innovation in software.
All IBM wants is for more light to be shone on the arcane and secret world of patenting, in the hope that greater transparency will make the system more effective, and lead to a significant reduction in the number of unwarranted patents that are issued.
Moreover, the way in which the story was initially presented to the world leaves one suspicious that there is a fair amount of spin in the announcement.
But if IBM does succeed in its aims it should at least be able to reduce its legal bills, and indeed the legal bills of many companies. As IBM's chief executive Samuel Palmisano commented in The New York Times, "The larger picture here is that intellectual property is the crucial capital in a global knowledge economy. If you need a dozen lawyers involved every time you want to do something, it's going to be a huge barrier. We need to make sure that intellectual property is not used as a barrier to growth in the future."
Critics will no doubt point out that what IBM is proposing was in any case always a given of the patent system, but has in recent years simply been lost sight of. They will also argue that, even if IBM's proposals are widely adopted, patents will continue to be a barrier to growth, and future innovation — at least until much greater changes are introduced than the company is calling for.
Tuesday, September 26, 2006
New Assignment
Slashdot is this week interviewing NYU professor Jay Rosen, a long-time proponent of civic journalism.
Rosen recently started NewAssignment.net using seed money from craigslist founder Craig Newmark, a $10,000 grant from the Sunlight Foundation and, it was announced last week, $100,000 from Reuters.
What is New Assignment? Essentially, says Rosen, it is "a way to fund high-quality, original reporting, in any medium, through donations to a non-profit called NewAssignment.Net."
What is of particular interest is that New Assignment will use Open Source methods to undertake the reporting.
What do Open Source methods mean in this context? They mean having professional journalists and what Rosen calls "the smart mobs" work together on reporting assignments. "Rather than proclaim one over the other," he explains, the aim is to utilise the advantages of both. The reporting will also take place in an open and transparent way.
In other words, professional journalists will work openly and co-operatively with ordinary citizens to produce news reports. As Rosen puts it, "The site gives out real assignments — paid gigs with a chance to practice the craft of reporting at a high level. Because they’re getting paid, the journalists who contract with New Assignment have the time — and obligation — to do things well. That means working with the smart mobs who gave rise to the assignment and handed it over to an editor and correspondent with the story part-of-the-way there."
So the starting point for assignments will be the Web and the smart mobs. Then if deemed worthy, at some point a story will be adopted by New Assignment and developed by journalists. Says Rosen: "The correspondent doesn't 'take over' until work is well along. The early stages are done in the open. The money for the reporting isn't raised until the story is outlined and partially developed. Which means you can see where it's going. Don't like it? Don’t contribute! The evidence for why there's a story there can be examined by anyone who is interested. When you go to the site itself, www.newassignment.net, assignments are in motion, from bubbling up to rolling out, sort of like projects at a studio."
The plan is to run a test project this year, with a view to going live in 2007.
I posted an extensive Q&A interview with Rosen on the topic of Open Source Journalism in March. And it is gratifying to note that amongst the six background items that Slashdot has recommended people read before posing questions for Rosen — including articles published by The Washington Post, USA Today, The Economist, and PBS — is the interview that Rosen did with me.
If you want to join the discussion on Slashdot then click this link!
Friday, September 22, 2006
Interview with Richard Jefferson
Today I am publishing an interview with Richard Jefferson, founder and CEO of CAMBIA, and advocate for the Biological Open Source Movement. This is number nine of The Basement Interviews.
The first part of the introduction is being published here on my blog. The interview itself, including the full introduction, is available as a downloadable PDF file (see below for details). The interview is being published under a Creative Commons licence.
The Basement Interviews
Biological Open Source
Richard Jefferson, founder and CEO of CAMBIA, and leading light of the Biological Open Source Movement, talks to Richard Poynder
Richard Jefferson was born in California in 1956, the son of music promoter and producer Carl Jefferson. His mother, Hermeline, was a stage actress turned librarian.
Jefferson's parents divorced before he was born, so he and his two siblings lived with their mother in a single-parent household. As they were "financially challenged", says Jefferson, all the children had to pull their weight, and there were few treats. "I worked as a 4 am to 8 am paperboy most of my childhood. We never even had a family holiday — not one — and almost all my clothes were from 'goodwill' or Salvation Army."
After a brief period in what he refers to as "a pretty feeble Catholic elementary school", Jefferson entered the public school system, where he was put into a programme for "mentally gifted minors".
Although good at biology at school, Jefferson was more excited by physics and physical chemistry. Physics, he explains, offered "an underlying method in which you can distil the fundamental principles of life." By contrast, biology was just "a lot of cool observational stuff."
The best undergraduate
Jefferson's attitude to biology changed in 1974, however, when he went to the University of California in Santa Barbara (UCSB), and was exposed to what he calls "hardcore molecular biology." Specifically, one of the first lectures he attended was given by molecular biologist John Carbon, who talked about the research he had been doing on recombinant DNA during a recent sabbatical at Stanford University.
As he listened, Jefferson realised that Carbon was describing the same kind of "core unifying logic" that had thrilled him in his school physics, but had until now been absent from biology. Increasingly excited at what he was hearing, Jefferson began firing questions at Carbon, and the lecture turned into a two-way conversation, with the other students gazing on with glazed eyes.
The incident was sufficiently singular that Carbon recalls it vividly. As he explained to me by email: "I remember I talked about recent developments in recombinant DNA research (this was when that field was in its infancy). Rick Jefferson — as he was then known — asked several questions during the lecture, and then afterwards came up to talk with me, and to ask more questions. He was very excited about the research I described."
Jefferson was so enthralled, in fact, that he immediately embarked on a campaign to persuade Carbon to let him work in his lab — an unheard-of privilege for an undergraduate.
Eventually, says Carbon, "I invited him to help out, even though I had never taken a first year student into the lab previously. At first he helped out post-doctorals and grad students, but eventually he became more independent; more like a grad student when he was a senior. And he worked there until he graduated."
This was at the dawn of molecular biology, and Carbon's lab was one of only three or four labs in the world then working in the field.
Determined to learn as much as possible about molecular biology, Jefferson then began pestering the University of Edinburgh, in Scotland, to let him spend a year of his undergraduate studies in the lab of Ken Murray. "I managed — by perseverance — to push my way into a lab in a culture where undergrads just don't do that," says Jefferson. "And I spent a wonderful and instructive undergraduate year there when research on recombinant DNA was just beginning in Europe."
After returning to Santa Barbara and completing his degree, Jefferson moved to Boulder, Colorado, to do a PhD in the lab of David Hirsh.
It wasn’t long before Jefferson made an important contribution to molecular biology himself, developing a gene reporter system called GUS. This was revolutionary because it allowed molecular biologists for the first time to monitor exactly what was going on when they were trying to implant foreign genes into an organism. As Jefferson explains, before he developed GUS "people were just chucking stuff into blenders without any idea of what was happening!" As we shall see, GUS was later to become a key tool in the armoury of molecular biology.
At Boulder, Jefferson acquired a reputation for being a talented but somewhat maverick scientist. As the then head of department Bill Wood told me by email, "Richard was a flamboyant and impressive grad student here, who also drove us crazy at times."
Jefferson's view is that his colleagues simply didn’t understand his obsession with methodology, and so didn't respect or appreciate what he was doing. "Everything in science is determined by the tools that fall into scientists' hands," he says. "And GUS is an example of really good methodology; it's a great tool."
In other words, scientists can be as brilliant as they like, but without the right tools they are limited in what they can achieve. Yet most researchers remain focused exclusively on the sexy business of pushing back the frontiers of knowledge, not on the quotidian task of creating the tools to enable cutting edge discoveries to be made. "Methodology has always been really dissed in science, and yet it is so important," complains Jefferson.
By the time he finished at Colorado, Jefferson had decided to shift the focus of his research: GUS had grown out of his work on worm embryonic development, but Jefferson had concluded that the genetic activity of plants is far more interesting. He began, therefore, to apply for funding to adapt GUS for plants.
To his anger and dismay, however, it took two years to get the necessary funding — a career hiccup, he later discovered, caused by the lukewarm references that David Hirsh had been writing for him.
Incidents like this were eventually to convince Jefferson that academia is not the meritocracy it claims to be, but an old boys' club. Too often, he complains, scientists' career prospects hang on decisions made in a non-transparent way, and on a "you scratch my back" basis.
In 1985, however, Jefferson finally got funding from the National Institutes of Health to go to the Plant Breeding Institute (PBI) in Cambridge, England.
Bench jockeys
For personal reasons, Jefferson's time at PBI was emotionally difficult. It was, however, a very productive period of his professional life. Discovering that adapting GUS was fairly straightforward, he turned his attention to other matters. And practically everything he touched, he says, "turned to gold".
Most notably, on June 1st 1987 Jefferson became the first person in the world to successfully plant a transgenic food crop. In doing so, he beat biotech giant Monsanto to the punch by one day!
At PBI Jefferson also found himself working alongside a great many scientists from developing countries — an experience that was to convince him that researchers from the West routinely exploit their colleagues from less wealthy nations, using them as "bench jockeys" for their own ends.
The greatest victims of the academic Old Boys' Club, he concluded, are scientists from poorer nations, since those in the West are all too happy to stand on their backs to further their own careers.
This means, says Jefferson, that even those able to get to the West to do some research can generally only aspire to "do some science, publish a paper, and then disappear back into Africa or China, or wherever." Once back home they lack the necessary tools, the funds, and the opportunity to carry on with their research.
In the context of biotech, Jefferson concluded, this meant that those countries that had most to gain from molecular biology were the least likely to benefit from it.
Worse, this inequity was being exacerbated by an undesirable new development in science, as its traditional openness was giving way to a culture of secrecy and greed.
When he started in Carbon's lab, explains Jefferson, everybody shared data and not a single patent had been filed in the field. As the potential of biotechnology became apparent, however, a patenting frenzy had gripped the scientific community, with individual scientists and biotech companies falling over each other to secure intellectual property rights — not only in the basic tools of molecular biology, but in the raw material too.
There is no better example of the way in which core technologies were being appropriated than the fate of the two principal means of transferring genes into plants. The gene gun developed at Cornell University had been patented, and the rights then sold on to DuPont — a transaction that earned Cornell more money than the university had ever before earned in royalties.
Meanwhile, a technique utilising Agrobacterium tumefaciens was itself rapidly being encircled by a sea of patents — patents that were later to become the subject of intense litigation — as Syngenta, Monsanto and Dow all fought over the rights.
While the increasing enclosure of the biotech commons was a growing source of frustration for scientists in the West, Jefferson saw that the consequences for developing countries were potentially devastating — since the hefty licensing fees required simply to engage in transgenesis threatened to lock them out of the considerable benefits that biotechnology promised, not least the ability to develop new plant varieties able to provide food security for their people.
More controversially, as initiatives like the Human Genome Project gathered pace, it was becoming evident that Western scientists and biotech companies were now intent on appropriating the very building blocks of life itself — by, for instance, patenting gene sequences.
Shared with the world
By now Jefferson had become convinced of the importance of making the basic tools of biotechnology freely available to all. Increasingly appalled at the way biotech was developing, he concluded that, whatever other people might do, he at least could act differently. In short, he decided to share GUS with the world.
So in 1987 he prepared 10,000 tubes of DNA sequences for use with GUS, wrote a comprehensive manual explaining how to use it with plants, and distributed lab packs to 500 research institutions around the world.
The result was instructive: within a short space of time GUS was the most widely used reporter gene in the field. "Because Richard shared GUS freely, and because it worked effectively, everybody started using it," explains Gary Toenniessen, director of food security at the Rockefeller Foundation. "This meant that even though other reporter genes had become available, GUS was the tool of choice for most scientists."
In short, although not conscious of the parallel at the time, Jefferson had independently come up with the same strategy as the Free Software Foundation (FSF), which was later to blossom into the Open Source Software Movement. GUS became the first choice of molecular biologists for the same reason that the Open Source Apache server has become the most widely used web server software on the Internet: it was freely available, and it worked!
Jefferson also began to receive "bug reports" about GUS, enabling him to improve it. In doing so he demonstrated that Linus' Law — "given enough eyeballs, all bugs are shallow" — is as applicable in biological innovation as it is in software development. All in all, says Toenniessen, GUS was "a good example of how the Open Source software model can work in biotechnology."
As a consequence, GUS was to prove instrumental in helping scientists around the world create more efficient varieties of maize, wheat, rice, soybean and cotton — not least those at Western biotech companies like Monsanto, which used GUS to develop the now hugely successful and ubiquitous Roundup Ready soybean.
Excited by this turn of events, Jefferson began to hatch a plan for a much grander project. Wouldn't it be great, he thought, if he could generalise what he had achieved with GUS across the whole of biotechnology?
By now Jefferson had also had first-hand experience of what he characterises as the "vicious, sophisticated but untidy, manipulative, staggeringly money-driven" process of biotech patenting. For when the University of Colorado had declined to patent GUS, Jefferson had done so himself.
Explaining his decision to do so today, Jefferson says that at the time he was somewhat naïve about intellectual property (IP). At the back of his mind, he explains, was a vague thought that by patenting GUS he could use the royalties to fund the creation of new inventions.
Whatever the reason, the experience was to prove an important milestone in his IP education, since it led him to enter into what he now refers to as a "horrible Faustian Pact" with a highly-regarded US patent attorney called Leslie Misrock — a pact that was to lead to a great deal of pain, and eventually legal action.
But this was all the more reason to try to do something about the situation. And the best way of doing so, Jefferson concluded, would be to create an organisation focused on encouraging and supporting greater sharing of core technology. But how, and where?
By the late 1980s Jefferson had become sufficiently disillusioned with the Old Boys' Club that he saw no prospect of creating such an organisation within academia. What was needed, he concluded, was a more conducive environment. Consequently, he says, "I decided to leave the star maker machine: professorships and stuff. I had come to hate it with a passion."
After hitting an emotional low point in 1988, Jefferson was rescued by his friend Stephen Hughes, whom he had first met in Edinburgh, in Ken Murray's lab. Hughes, he says, "dragged me off to Southern Italy with my tail between my legs."
At that time director of biotech research for an Italian food conglomerate, Hughes organised a six-month visiting professorship for Jefferson, and helped him sketch out his plans for the future.
"It was a very important time for me because Steve is a very creative man and gave me a lot of help," says Jefferson. "Many of the good ideas I came up with came out of discussions with Steve in those early days, in Mozzarella land."
Feeling that the kind of organisation he had in mind would be most effective if it were able to operate under the aegis of an international development body like the United Nations, in 1989 Jefferson accepted a post as the first molecular biologist at the UN's Food and Agriculture Organisation (FAO).
Sadly, the FAO proved a false start. Within a short space of time it became apparent to Jefferson that the UN was just one more old boys' club — with member nations more focused on pursuing their own narrow interests than on helping the world's less wealthy nations feed themselves.
Disappointed by his inability to make headway, and disillusioned by the UN's incestuous politics, Jefferson left the organisation in 1991, having determined that he would need to create a private initiative. Out of this disappointment and disillusionment would be born CAMBIA and the BiOS initiative …
####
If you wish to read this interview in its entirety please click on the link below. I am publishing it under a Creative Commons licence, so you are free to copy and distribute it as you wish, so long as you credit me as the author, do not alter or transform the text, and do not use it for any commercial purpose.
If after reading it you feel it is well done you might like to consider making a small contribution to my PayPal account. I have in mind a figure of $8, but whatever anyone felt inspired to contribute would be fine by me. Payment can be made quite simply by quoting the e-mail account: richard.poynder@btinternet.com. It is not necessary to have a PayPal account to make a payment.
What I would ask is that if you point anyone else to the article then you consider directing them to this post, rather than directly to the PDF file itself.
If you would like to republish the article on a commercial basis, or have any comments on it, please email me at richard.poynder@btinternet.com.
I would like to acknowledge the help of the Open Society Institute, which provided a small upfront grant to enable me to get started on The Basement Interviews project. Further information about The Basement Interviews can be found at the Open and Shut? site.
To read the full introduction and interview (as a PDF file), click here.