Wednesday, December 4, 2013

The Power of Platforms

In Product and System Design, a platform is a structuring principle or foundation on top of which a set of independent elements and interfaces can be arranged, rearranged, and innovated upon (see [1] and [2]). Shared key components and assets define the core of the platform, and “diversification can be achieved by building upon and extending capabilities to build new, but related foundations” [3]. Baldwin and Woodard [4] point out that most platform definitions identify the reuse or sharing of common elements as a core characteristic, and that all platforms are “modularizations of complex systems in which certain components (the platform itself) remain stable, while others (the complements) are encouraged to vary in cross-section or over time” [4]. Well-structured platforms offer numerous advantages, such as cost savings, increased production efficiency, and the ability to evolve and produce variety at large scale (see [5]).
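
To make the stable-core/variable-complement idea concrete, here is a minimal sketch in Python (the names and structure are illustrative, not drawn from the cited papers): the platform core stays fixed and only knows a shared interface, while independent complements vary freely behind it.

```python
from abc import ABC, abstractmethod

class Complement(ABC):
    """A variable module that plugs into the stable platform core."""
    @abstractmethod
    def run(self) -> str: ...

class Platform:
    """The stable core: it knows only the shared interface, never the modules."""
    def __init__(self) -> None:
        self._complements: list[Complement] = []

    def register(self, complement: Complement) -> None:
        self._complements.append(complement)

    def launch_all(self) -> list[str]:
        return [c.run() for c in self._complements]

# Two independent complements varying "in cross-section" on one stable core:
class SpreadsheetApp(Complement):
    def run(self) -> str:
        return "spreadsheet running"

class MailApp(Complement):
    def run(self) -> str:
        return "mail client running"

platform = Platform()
platform.register(SpreadsheetApp())
platform.register(MailApp())
print(platform.launch_all())
```

New complements can be added or swapped without touching the `Platform` class, which is exactly the low-cost variety Baldwin and Woodard describe.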

In Computer Science, the term was initially used to refer to the computing hardware, and later to the operating system (OS), upon which application programs run.
Early computers had to be designed and built from the ground up whenever new releases were planned. Not infrequently, incompatibility with newer systems made moving data to different formats a costly process. In the early 1960s, IBM created the first modular computer family, the System/360. This model was a landmark in the computer industry because it “offered the possibility for other firms to enter and compete in the computer marketplace, on the basis of providing modules that would ‘slot in’ or ‘plug in’ IBM System/360 architecture.” [5]

As computer hardware became modular and grew in complexity, systems without OSs presented enormous challenges for programmers. Early computers required the full hardware specification to be attached to the application every time it was executed. A program was therefore suitable for only one machine, and every time someone needed to run it, the whole process of loading commands specific to that computer had to be repeated. As a way to optimize computer systems, the industry realized that the fundamental set of instructions concerned with hardware management could be loaded into memory and managed separately. The distinction between applications and the OS was then created.
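
The split the industry arrived at can be illustrated with a toy sketch (hypothetical names, not any historical system): the device-specific instructions live in one resident layer, and applications call its interface instead of bundling hardware commands themselves.

```python
class ToyOS:
    """Resident instructions for hardware management, loaded once per machine."""
    def __init__(self, device_table: dict[str, str]) -> None:
        self.device_table = device_table  # maps logical names to device commands

    def write(self, device: str, data: str) -> str:
        command = self.device_table[device]  # the hardware detail is hidden here
        return f"{command}: {data}"

def payroll_app(system: ToyOS) -> str:
    # The application no longer carries the printer's command set itself.
    return system.write("printer", "weekly payroll report")

machine_a = ToyOS({"printer": "SEND-TO-LPT-CHANNEL-7"})
machine_b = ToyOS({"printer": "IOCTL-PRINT-UNIT-2"})
print(payroll_app(machine_a))  # same program, different hardware
print(payroll_app(machine_b))
```

The same `payroll_app` runs unchanged on both machines; only the OS layer differs.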

The reuse of code and the modularization of computer systems brought many advantages to the field, especially regarding cost, manageability, and the evolution of complex systems. According to Baldwin and Woodard, “by promoting the reuse of core components, such partitioning can reduce the cost of variety and innovation at the system level. The whole system does not have to be invented or rebuilt from scratch to generate a new product, accommodate heterogeneous tastes, or respond to change in the external environment. In this fashion, the platform system as a whole becomes evolvable: it can adapt at low cost without losing its identity or continuity of design.” [4]

Developers were freed from rewriting kernel instructions for every single application and could use the OS as a platform upon which several programs sharing the same specifications could run. Moreover, the same OS, as was the case with MS-DOS, could be used across a large variety of hardware systems from different vendors, provided those were “IBM compatible”. As Gawer [5] points out, “in the process of modularizing the system architecture, the modules are defined and the rules of interaction between modules are clarified. The interface specifications formalize, codify, and therefore reveal the rules of interaction between modules. Once this information is revealed and codified, it becomes easy to transfer, but also becomes hard to protect.” [5] It certainly was hard for IBM to protect its platform, especially because of the highly innovative ecosystem it had promoted.

The modularity of computer systems became even more robust with the advent of Object-Oriented Programming (OOP) and its concepts of encapsulation, polymorphism, inheritance, and so on. OOP took the reusability of software components to the next level. Powerful software development platforms, or frameworks, enormously reduced the complexity and cost of writing code. According to Pree [6], frameworks “mean a real breakthrough in software reusability: not only single building blocks but whole software (sub-)systems including their design can be reused.”
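
A minimal sketch of what Pree describes, under illustrative names of my own: the framework fixes the overall design, here a small processing pipeline, and an application reuses that whole design by overriding only the variable building blocks (Pree's “hot spots”).

```python
class ReportFramework:
    """Reusable design: load -> transform -> render, always in this fixed order."""
    def generate(self, source: str) -> str:
        data = self.load(source)
        return self.render(self.transform(data))

    def load(self, source: str) -> list[str]:           # default building block
        return source.split(",")

    def transform(self, data: list[str]) -> list[str]:  # hot spot
        return data

    def render(self, data: list[str]) -> str:           # hot spot
        return "\n".join(data)

class UppercaseReport(ReportFramework):
    # Inheritance + polymorphism: only the variable part is rewritten;
    # the pipeline design itself is inherited unchanged.
    def transform(self, data: list[str]) -> list[str]:
        return [item.strip().upper() for item in data]

print(UppercaseReport().generate("alpha, beta, gamma"))
```

The subclass reuses the framework's entire control flow and touches only one method, which is the “(sub-)system including its design” kind of reuse the quote refers to.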

The inexpensive and powerful strategies that frameworks allow create a rich ecosystem for application development, leading to enormous variety and innovation, especially when the platform is popular and open to a great number of developers. Rich ecosystems can be observed especially in mobile platforms such as iOS, Android, Windows Phone, and so on. Both Apple’s App Store and Google Play already count more than 1 million different apps[1][2]. These considerable numbers of mobile apps were only reached when the leading companies in the smartphone field decided to partially open their systems to outside developers who had “skills, capabilities, or an understanding of user needs that those inside of the firm do not” [4]. By allowing new constructions and manipulations of building blocks inside the mobile platforms, not only did the degree of innovation, scalability, and evolvability increase, but new device releases also rose by a factor of five (see [4]).



[1]        N. Phillips, G. Sewell, and D. Griffiths, Eds., Technology and Organization: Essays in Honour of Joan Woodward. Bingley: Emerald, 2010.
[2]        O. L. de Weck, E. S. Suh, and D. Chang, “Product family strategy and platform design optimization,” MIT Working Paper, 2004.
[3]        M. H. Meyer and J. M. Utterback, “The product family and the dynamics of core capability,” 1992.
[4]        C. Y. Baldwin and C. J. Woodard, “The architecture of platforms: A unified view,” Platf. Mark. Innov., pp. 19–44, 2009.
[5]        A. Gawer, “Towards a General Theory of Technological Platforms,” in DRUID Summer Conference, 2010.
[6]        W. Pree, “Meta Patterns - A Means for Capturing the Essentials of Reusable Object-Oriented Design,” in Proc. ECOOP ’94, 1994.




[1] Accessed November 25, 2013. http://www.theverge.com/2013/10/22/4866302/apple-announces-1-million-apps-in-the-app-store
[2] Accessed November 25, 2013. http://mashable.com/2013/07/24/google-play-1-million/

Thursday, September 26, 2013

The Global Brain


 Visualization of a partial map of the Internet*

Due to new communication technologies, individuals from all over the globe collaborate in a diversity of ways never before possible in human history. On the Internet, collective intelligence manifests itself naturally, not only in individual and collaborative works built upon previous knowledge, but also in the human-human and human-computer relationships that occur on the network.

Past visions of human-computer symbiosis have turned out to be revealing to a certain extent. In his 1960 article “Man-Computer Symbiosis,” Licklider [1] predicted that human brains and computers would be coupled tightly together and that “the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today” [1]. The world is already deeply interconnected by the Internet and social media, and this is promoting discussions and numerous revolutions and movements across the globe[1] with unprecedented outcomes.

Man-made networks connect millions of individuals, who themselves are made of a combination of different kinds of biological networks. The neural networks of the human brain share similarities with the Internet in their structure and the way they function. Eguiluz et al. describe super-connected brain regions in neural networks that appear to work as “hubs”, facilitating communication between distant “nodes”, i.e., specific and less connected brain areas (see [2]). Those “hubs” operate in a way comparable to search engines (e.g., Google), in that they do not hold the needed information but point to the location where it can be found. In addition, recent research has shown that some of the laws governing the growth and dynamics of complex network systems are similar regardless of whether those networks are the human brain or the Internet (see [3]).
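
As a toy illustration of the hub idea (invented data, not from the cited studies): in an edge list, hubs are simply the nodes with disproportionately many links, so a plain degree count is enough to surface them.

```python
from collections import Counter

# A tiny hand-made graph: one highly connected node plus a sparse periphery.
edges = [
    ("hub", "a"), ("hub", "b"), ("hub", "c"), ("hub", "d"), ("hub", "e"),
    ("a", "b"), ("c", "d"),  # a few peripheral links
]

# Count how many links touch each node (its degree).
degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

# The highest-degree nodes play the routing role described above.
for node, links in degree.most_common(3):
    print(node, links)
```

Running this prints `hub` first with five links, mirroring how a few well-connected regions mediate traffic between many sparsely connected ones.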

Macro and micro resemblances might hold the answer to why we are so comfortable spending hours online, especially the new generation born in the Internet age. This special configuration that mixes biology and technology, the “Global Brain” [4], holds the potential “to become truly transformative in domains from education and industry to government and the arts” [4]. The Global Brain, the result of human-computer symbiosis, has become increasingly present not only on the web but in our daily lives, in very subtle ways, with the help of the new generation of mobile devices. The world is becoming increasingly mediated by technology that already recommends the best restaurants in town, which movies are worth watching, and the best routes to avoid traffic, and that in the near future will drive our cars wherever we would like to go.

It is therefore of great significance to understand the pillars of human-computer symbiosis and its collective intelligence. O’Reilly [5] points to the architecture of the new and smart Global Brain, in which both individuals and the network feed and are fed, processing information and mediating each other. The pillars of the network’s collective intelligence and its structures can be found in some landmark developments of computer history, such as the open-source philosophy, collaborative systems, and platform architectures.

*The Opte Project. The map is based on Internet IP addresses. Each line is drawn between two nodes, and each node represents an IP address. Image taken from: http://www.marleenandela.com/img/1105841711.LGL.2D.1024x1024.png

[1] From the Maker Movement to the Arab Spring and Occupy Wall Street.


References:

[1]        J. C. R. Licklider, “Man-Computer Symbiosis,” Ire Trans. Hum. Factors Electron., vol. HFE-1, pp. 4–11, Mar. 1960.
[2]        V. M. Eguiluz, D. R. Chialvo, G. A. Cecchi, M. Baliki, and A. V. Apkarian, “Scale-free brain functional networks,” Phys. Rev. Lett., vol. 94, no. 1, p. 018102, 2005.
[3]        D. Krioukov, M. Kitsak, R. S. Sinkovits, D. Rideout, D. Meyer, and M. Boguñá, “Network Cosmology,” Sci. Rep., vol. 2, 2012. [Online]. Available: http://www.nature.com/srep/2012/121113/srep00793/full/srep00793.html. [Accessed: 15-Sep-2013].
[4]        A. Bernstein, M. Klein, and T. W. Malone, “Programming the Global Brain,” Commun. ACM, vol. 55, no. 5, May 2012.
[5]        T. O’Reilly, “The Future of Cooperation,” talk sponsored by CITRIS (Center for Information Technology Research in the Interest of Society), YouTube. [Online]. Available: http://www.youtube.com/watch?v=0D3_2iCldUA. [Accessed: 24-Sep-2013].

Thursday, September 19, 2013

Web-based Collaborative Platforms: Nupedia vs. Wikipedia

Web-based collaborative platforms are systems designed to support individuals in accomplishing tasks cooperatively on the Internet. These applications may also support the communication and coordination of work produced collaboratively by two or more people. Bafoutsou and Mentzas [1] and Moreno-Llorena et al. [2] propose five different categories of collaborative systems according to their functionalities:

  • Group File and Document Handling (GFDH) Tools: they support the management of files and documents handled by groups, including shared views, synchronous or asynchronous editing, and notification mechanisms for when files are altered (see [1]).

  • Computer Conferencing (CC) Tools: they allow synchronous and asynchronous communication via text, audio, and/or video. Shared desktops or whiteboards, as well as protocols for sending and receiving files, are common functionalities for these kinds of tools (see [1]).

  • Electronic Meeting System (EMS) Tools: usually used by private corporations, they facilitate brainstorming, problem solving, decision-making by voting, action planning, surveys, and the production of documentation, among other functionalities (see [1]).

  • Electronic Workspace (EW) Tools: they provide groups with virtual working spaces where documents and files can be centrally stored. In addition, those platforms can support to-do lists, calendars, address books, and workspaces for specific projects (see [1]). 
 
  • Online Social Network Systems (OSN) Tools: they connect users and facilitate the “sharing and finding of content, and disseminating information” [2]. Popular social network sites are examples of this category; they also provide bulletin boards, synchronous/asynchronous messaging, and other functionalities such as voting and the creation of events.

The free-content encyclopedia Wikipedia is a successful example of a GFDH tool based on an openly editable model that came from the open-source experience, which sees users as potential contributors. Wikipedia covers an enormous range of subjects: more than 22,000,000 articles have been written so far in 285 languages. As in open-source software projects, the collaborative process has led to the creation of articles “of remarkably high quality” [3, p. 21]. This volume and quality of publication were only possible thanks to its approximately 77,000 active contributors[1].

Originally, Wikipedia was proposed as a complementary project to its predecessor, Nupedia, as a means to generate content faster and in larger amounts. Although also intended to be collaborative, Nupedia was not able to release a consistent amount of articles during its existence. Its main problems were a highly bureaucratic publication procedure, an elitist selection of contributors, and a non-wiki platform. Nupedia’s content approval process had seven complicated steps encompassing editing, fact-checking, and peer review. In addition, most of its articles were written by experts holding PhDs. Although non-expert individuals could also contribute articles, their submissions would most likely be rejected, with only a few exceptions.

Therefore, Jimmy Wales and Larry Sanger[2] decided to launch Wikipedia to facilitate the production of articles that would later go through Nupedia’s review process. Basically, the idea was to find a way that “uncredentialed people could participate more easily” [4]. What set Wikipedia apart was its deeply open-source philosophy, which considers everything a “draft in progress, open to revision. A centralized authority does not approve projects before they are launched, but rather decentralized authority improves them constantly.” [5]. Impressively, the quality of the articles and the fact that Wikipedia is not prone to amateurism and vandalism are due to a community that is “passionate about the topics they know and care about” and that “tends to trump both inaccuracy and vandalism over time” [5].

The success of Wikipedia relies on its wiki architecture and its clear policies and guidelines. A wiki (from the Hawaiian for “quick”) is a type of Content Management System (CMS) application that allows individuals to add, edit, or delete content in a cooperative manner. Individuals are driven to cooperate, if not by writing entirely new articles, then by linking keywords between articles, fixing spelling and grammar mistakes, or improving the clarity of sentences (see Figure 2).

Figure 2 Welcome page of Wikipedia
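
The wiki mechanics just described can be sketched in a few lines (a hypothetical toy, not Wikipedia’s actual MediaWiki software): pages are shared documents that anyone can edit, with every revision kept, which is what lets the community review changes and revert vandalism.

```python
from datetime import datetime

class ToyWiki:
    def __init__(self) -> None:
        # Each page title maps to its full revision history.
        self.pages: dict[str, list[tuple[str, str, datetime]]] = {}

    def edit(self, title: str, text: str, author: str) -> None:
        # An edit appends a new revision instead of overwriting history,
        # so earlier versions remain available for review and rollback.
        self.pages.setdefault(title, []).append((text, author, datetime.now()))

    def read(self, title: str) -> str:
        return self.pages[title][-1][0]  # the latest revision is current

    def history(self, title: str) -> list[tuple[str, str, datetime]]:
        return self.pages[title]

wiki = ToyWiki()
wiki.edit("Collective Intelligence", "First draft.", "alice")
wiki.edit("Collective Intelligence", "First draft, clarified.", "bob")
print(wiki.read("Collective Intelligence"))
print(len(wiki.history("Collective Intelligence")), "revisions")
```

Anyone can call `edit`, mirroring the openness of the wiki model; the retained history is the technical counterpart of the community oversight discussed above.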

Nothing but its policies and guidelines keeps Wikipedia from being just another wiki. According to Sanger [4], Wikipedia has a different culture from regular wikis because “it’s pretty singlemindedly aimed at creating an encyclopedia” [4]. Although the architecture of wiki software encourages openness and decentralization, allowing deviant content to be fed into Wikipedia, the community of Wikipedians complies with Wikipedia’s five fundamental pillars. The rules that define the principles of the project are[3]:

  • Wikipedia is an encyclopedia.
  • Wikipedia is written from a neutral point of view.
  • Anyone can edit, use, modify, and distribute.
  • Editors should treat each other in a respectful and civil manner.
  • Wikipedia has no firm rules.

Before becoming Wikipedia, the project was launched as Nupedia’s wiki branch, but it soon received criticism for being too informal and unstructured. The advisory board did not want a wiki as part of Nupedia, so Wikipedia was re-launched as an independent project on January 15, 2001[4]. Wikipedia’s flexibility and dynamism helped it reach 18,000 published articles in its first year; by comparison, Nupedia released only 21 articles during its first year. As Rettberg [6] points out, Nupedia “failed to trust the collective intelligence of the network.” [6, p. 199]




[1] Data extracted from www.wikipedia.com. Accessed September 2013.

[2] Jimmy Wales and Larry Sanger were the founders of both Nupedia and Wikipedia projects.

[3] Data taken from http://en.wikipedia.org/wiki/Wikipedia:Five_pillars


[4] Data from http://en.wikipedia.org/wiki/Wikipedia:About#Wikipedia_history

References


[1]        G. Bafoutsou and G. Mentzas, “Review and functional classification of collaborative systems,” Int. J. Inf. Manag., vol. 22, 2002.
[2]        J. Moreno-Llorena, C. Iván, R. Martín, R. Cobos, J. de Lara, and E. Guerra, “Towards a Functional Characterization of Collaborative Systems,” in Cooperative Design, Visualization, and Engineering, pp. 182–185.
[3]        T. W. Malone, R. Laubacher, and C. Dellarocas, “The Collective Intelligence Genome,” MIT Sloan Manag. Rev., vol. 51, no. 3, Spring 2010.
[4]        L. Sanger, “The Early History of Nupedia and Wikipedia: A Memoir - Slashdot.” [Online]. Available: http://features.slashdot.org/story/05/04/18/164213/the-early-history-of-nupedia-and-wikipedia-a-memoir. [Accessed: 13-Sep-2013].
[5]        S. Rettberg, “All together now: Collective knowledge, collective narratives, and architectures of participation,” Digit. Arts Cult., 2005.
[6]        R. Page and B. Thomas, Eds., New Narratives: Stories and Storytelling in the Digital Age. Lincoln: University of Nebraska Press, 2011.