Beyond data distribution

notes on the developments and contexts for net distribution systems
Zeljko Blace

text originally drafted for CAT's MEAOW event @ NYU in April 2003
http://cat.nyu.edu/meaow/

the current version is published and open for comments at:
http://tamtam.mi2.hr/zblace/BeyondDataDistribution

Intro of MEAOW:

“What are the possibilities for internet-based distribution and production of video and audio?

Napster, Gnutella and their descendants have famously demonstrated the sheer scale of p2p filesharing systems, and the difficulties of exploiting them for the benefit of traditional entertainment products under traditional intellectual property regimes. However, less attention has been paid to the emerging audio and video products and the new genres of cultural product that exploit net-based distribution and production. This panel will survey different experiments and projects in this realm, in particular projects that are designed to promote and sustain diverse cultural resources, generating demonstrable social value.”

When thinking of the phenomenon of the “p2p revolution” that has become ubiquitous in Internet news portals and expert technology analyses over the past two years, memories of the early age of the web in the mid-nineties come to my mind, along with the prophecies of its future evolutions. Many of those concepts fell through within the following years and were easily forgotten to make room for new hype. However, some remained in a technological conceptual form or as a part of the digital cultural heritage.

I joined the "net class" about one month before Netscape introduced the background tag in their web browser, and the aesthetic impact it had on numerous web pages was the first exclusively net phenomenon I was aware of. Web design, at first limited to simple formatting of text with efficiency in mind (only rarely would images show up as an important part of the overall style), seemed like the only option for a long time. However, the introduction of the new tag seems to have initiated an unavoidable change in how we have perceived the web ever since. Was this the first web revolution? The crucial point when designers, as form-focused professionals, and their businesses started selling the web as a visual medium, as if the communicational potential of the medium were too hard to market without spectacular visions of cybernetic networks?

As time passed, the web developed gradually, losing some of its early charm, when every other website carried an “under construction” sign. The web was no longer to be a process, but a product, with all of the marketing, branding and wrapping that inherently comes with it.

The online environment started changing from its .gov/.mil/.edu origins to a web dominated by .com, with several big ICT corporations dominating the early IT markets (mostly Sun and IBM, but also fast-growing ones like Netscape and a late-arriving Microsoft, always ready to catch up and build upon rivals’ work). At that point (moving into the late nineties), content industries bloomed on the IT bubble of the new economy. Wired was predicting the death of the web in favor of new “push channel” technologies (remember the Wired Netscape channel, or Marimba?), which it bravely developed just before the grand fiasco of this “future” technology, only to realize that content is scarce and that information not only wants to be free (as the community of web developers that escaped Wired’s control to form evolt.org demonstrated), but also needs a lot of financial investment.

The economy has its rules even in virtual markets, and though the production of content might get cheaper, it will never get close to zero, unlike distribution and consumption costs, which are getting closer to it all the time. Instead of big content industries, community-driven websites like Slashdot.org established a new form of content production – aggregating it from a community of dedicated readers and sharing it with a wider community. Welcome to the .org era!

CULTURAL ORIGINS:

from academic reviews of the early web to HotLine and Freenet connectives

Ever since the early days of the web (and the primary focus of its founder, Tim Berners-Lee), information distribution was an essential community process for its early adoption. For the sake of academic peer review, a model was developed between a few networked physicists: the World Wide Web established a platform that would soon radically change how we perceive global systems of information and knowledge distribution. Through its development in the nineties this was often forgotten, but it remained an important part of the practices of smaller, often marginalized communities.

It is not without reason that the Mac user community had a higher need for sharing and collaboration in the times of Apple's biggest crisis: HotLine was developed out of the need to easily share personal files while chatting with fellow users (a cohesive moment in a crisis), at a time when Apple was unable to provide quality distribution methods or support and was going downhill. After the method was established and the benefits of sharing became evident, the whole system was there to stay and be replicated by others. When the safety of the protocols was in danger, newly formed networks of p2p evangelists liberated the concept once more with the introduction of a set of clones that later diversified into variants, of which Freenet became the most outstanding. The model of "review for the benefit of research" was lifted from the academic community into the technical sphere, to be transformed one decade later into "share for the benefit of consumption". The shift of focus seems to be linear, but recent experiences of sharing intellectual work for joint benefit (projects like Wikipedia and GNUpedia) show that this structure embedded in these systems goes beyond a simple drift from academic reviewing to prosumeristic file-sharing of mp3 or DivX files.

ECONOMICS OF DEVELOPMENT:

open vs. proprietary >> swarm intelligence vs. dynamics of marketing cycles

To distribute the findings/results of one's work is the most basic practice of non-commercial and academic developers in the ICT field, where highly mature FLOSS (Free/Libre/Open Source Software) development procedures almost serve as a rule. But as closed-source and proprietary technologies (still the basis for most commercial development) adapt to new forms of shared-source software development, it is interesting to see what other possible models of production exist and when/why they lose ground on a daily basis. As an example, let us take the earlier-mentioned push-channel technologies and p2p technologies. Businesses develop technologies in fixed cycles, where each fixed loop consists of two parallel circles: the marketing department with its market research and advertising activities, and the development department with its plan, develop and “demo or die” phases. If these two departments are out of sync or have different rhythms, the product is not delivered to the end user in a stable form or in due time (which was the reason for the premature death of many interesting technological developments). Development is narrowed down to a framework of the marketing department’s presumptions about what the end user needs or could need in the near future. Once the product is on the market, it has a limited time to be adopted by a wide user base or it is discontinued, never to be offered again (insert random Darwinian evolution metaphor?).

On the other hand, the wide, heterogeneous and incoherent environment of FLOSS development works much like cultural practice: it tends to expand in all possible directions, in all different rhythms and contexts, and becomes open to endless recycling of ideas and adaptation to situations, ready to die at the granular level and be resurrected in a new project. Although its efficiency is often disputable in the short run, its maturity is evident in the success of the few major projects that overcame these obstacles, projects such as the GNU/Linux operating system, the GIMP bitmap graphics manipulation program, the Apache web server and the Mozilla web browser. Within these dynamics, the concept of push channels failed not only because of bad timing and the unpolished products/services delivered the first time around, but also because they had only one chance to become a feasible money-making technology (many of the clients were never developed beyond a second version). At the same time, p2p concepts have had numerous incarnations with varying success and have built on the code and concepts of their predecessors thanks to the lack of copyright restrictions. The economy of open development has saved the idea of p2p networks from the direct legal constraints that prevented Napster from becoming a single-implementation market standard, benefiting different FLOSS implementations instead.

POLITICS AND COALITIONS:

peer2peer technology as a bus stop on information super-highways

One of the important aspects of the “free market economy”, according to which ICT mostly operates now, is an ever-changing policy on p2p issues, which proceeds at once in a very political manner (with strict implementation of legislation) and in a very pragmatic manner (aware of the potential commercial benefits of users “avoiding” these same laws). Sometimes a random mix of these approaches makes unusual coalitions possible. At the turn of the century, most ISPs and PC hardware vendors were in a growth crisis, as their market wasn’t growing as strongly as predicted. Net access and hardware manufacturing can get cheaper only up to a point before they become unsustainable; beyond that, further price-cutting can be fatal, as users tend to pay only lower, never higher, prices once a service has been established (especially when open market competition is pushing the advancement of technology). Big corporations and state institutions tend to use outdated technologies as long as they work, and even the y2k issue didn’t produce enough pressure for most of them to upgrade. So what did make them upgrade and eventually re-expand the market for hardware and net services? It was the ease of access to and distribution of high-quality content.

After mp3 became a standard and established a system of dissemination in which the exchange was no longer point-to-point FTPing or rsyncing of media folders with friends, but running a p2p client in networks of random unknown users, the whole field exploded. DivX instantly became the next video standard and made a big leap in file transfer size and CPU usage, as the Pentium 100MHz (enough for mp3 playback) had to be replaced by Pentium II/III/IV machines, which showed (and needed) their full strength just to play or encode full-quality video broadcasts. With this turnover, the RIAA and its partners in the copyright (DRM) coalition of the Hollywood industries gained a new enemy in the strongest corporate sector of Silicon Valley; for the first time in recent history, a business-vs-business war resulted in direct benefits for end users. This is becoming even more evident as Microsoft and Apple bundle authoring tools with their new operating systems (Movie Maker and iMovie for video), encouraging users for the first time to get their hands dirty in media authoring and to distribute/publish their work. How long this war will last before the Californian businesses start synching their efforts is hard to say, but for the end user it is the same song all over again: “Enjoy it while you can!”


TACTICS OF GRANULAR RESISTANCE:

clients that disturb the backbones

In the recently acquired net space filled with digital music, video and warez software, populated by millions of bit-pumping and bit-sucking p2p clients, a new force has emerged, strong enough to disturb the big backbones, yet not associated with any of the Jedi knights. The disproportionate technical simplicity and power of p2p clients has in a short time strongly reshaped bandwidth usage readings in entire regions, allocating new spaces for data transfers and drawing new maps where no traffic hogs had ever appeared before. From the perspective of Eastern Europe, known for its loose regulations and piracy, the impact was more than visible. The Croatian Academic and Research Network expanded its bandwidth more than tenfold (in the act of joining the European association of academic networks) and in the same week had 96% of its new capacity filled by p2p traffic. After an instant analysis of these findings and a look at the network traffic, it was discovered that at the core of the network, students at the University of Informatics and Computing in Zagreb kept p2p clients running 24 hours a day, with their hard drives filled with copyrighted media (acting as massive storage spaces), providing transfer points for file transfers from the East Coast to the West Coast of the USA. The trend was later stopped by several measures, including prohibiting the use of p2p clients on publicly accessible workstations and penalizing students who didn't abide by the new regulations. However, this example illustrates only a fragment of the impact that p2p technologies have had on infrastructure and bandwidth resources, while more political/activist uses are still to be developed (especially in the field of p2p streaming), researched and analyzed.


AESTHETICS OF DISTRIBUTION:

netcasts as hybrids

In one of the recent aesthetic theory bestsellers, “The Language of New Media”, Lev Manovich discussed the uses of the “digital” and the presumption of lossless digital copying, which in his opinion is far from reach. However, he does not go into differentiating copying from distributing in the case of net streaming, and leaves the field untouched for others to come. One of the most interesting aspects of netcasts (or web streaming technologies) is that although they are culturally/socially/aesthetically regarded as performative activities (real-time interaction), on the basic technological level they are primarily activities of data distribution. Networks are filled with multiple instances of packets, and receptive clients that tolerate missing packets of data connect whatever load reaches them first into a continuous flow of audio and video, regularly disrupted by glitch artifacts. However, being focused on the content (discontinuous or not), we accept the glitch as the inherent aesthetic of the streaming media we use, and often go even further into creative use of graphical/sonic artifacts, up to the point of addressing them as THE content. By doing so, we provide a framework for the aesthetic evaluation of a “lossy” distribution method and establish qualities (glitch) unknown to previous networked art practices such as mail-art or the more recent net.art.
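
To make the distribution mechanics behind that glitch concrete, here is a minimal sketch of such a loss-tolerant streaming client, written in Python. The packet format (a 4-byte sequence number followed by raw audio data) and the port number are hypothetical assumptions for illustration, not taken from any actual netcast protocol. The client plays whatever arrives and substitutes silence for packets that never make it, which is exactly where the audible artifact originates.

    import socket
    import sys

    # Hypothetical packet format (an assumption for illustration, not a real
    # streaming protocol): 4-byte big-endian sequence number + raw audio payload.
    PACKET_SIZE = 1024
    PAYLOAD_SIZE = PACKET_SIZE - 4
    SILENCE = b"\x00" * PAYLOAD_SIZE  # stand-in for every packet that never arrives

    def play(chunk: bytes) -> None:
        """Stand-in for handing a chunk of audio to the sound device."""
        sys.stdout.buffer.write(chunk)

    def main() -> None:
        # UDP: a streaming client would rather drop data than wait for it.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", 9999))

        expected_seq = 0
        while True:
            datagram, _addr = sock.recvfrom(PACKET_SIZE)
            seq = int.from_bytes(datagram[:4], "big")
            if seq < expected_seq:
                continue  # a late or duplicate packet; the stream has moved on
            # Every sequence number we skipped is a lost packet: fill the gap
            # with silence so the flow stays continuous -- this is the glitch.
            for _ in range(expected_seq, seq):
                play(SILENCE)
            play(datagram[4:])
            expected_seq = seq + 1

    if __name__ == "__main__":
        main()

The design choice mirrors the point above: continuity of flow is valued over fidelity of data, so loss is absorbed into the stream rather than corrected, and the artifact becomes part of the aesthetic.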

Conclusion? Before reaching for the next technology or distribution paradigm that introduces innovative practices and behaviors, which will in turn shape future network formations, it would be important to learn from existing technologies and the phenomena they have introduced. Though distribution technologies are in constant flux, and businesses are catching up with or overtaking independent and autonomous models in this race, there is always more at stake than simple technological advantage. Whether wireless network technology and satellites will overtake the existing earthly networks of copper and optic fiber has always been less a technological issue than an economic, political, cultural, tactical and even aesthetic one, arising from the processes of implementation and interaction. But what makes artistic and activist practice so important is its advantage in being able to conceptualise new models that develop from different spheres and perspectives within society.