The Scrutiny of Current Video Game Systems & Their Foreseeable Future

Once again the controlling power of video games has shifted, and this time from the electronics giant Sony to the computer giant Microsoft. For the first time since the original Sony PlayStation, Sony had been trumped by the computer company in terms of technology. This meant they needed to hit back harder with their next-generation console.

the Xbox 360 was the first HD, 60fps console to be released

But to everybody’s surprise Microsoft struck first. In November 2005 the Xbox 360 was released and, impressively, it launched in all major regions within weeks of each other. The seventh-generation console was once again beefed up, with HD support, a DVD player, 512MB of RAM, broadband compatibility, wireless controllers and optional hard drives that eventually reached a whopping 320GB. This meant that the console could evolve and develop over time, prolonging its lifespan and making it one of the longest-lasting consoles to date.

Almost one year later Sony released its PlayStation 3; however, the console was more expensive with almost exactly the same specification as the Xbox 360. Its GPU was clocked only 50MHz higher, though it shipped with a larger SATA hard drive (eventually up to 500GB) as well as new wireless controllers. Its most impressive feature, and one the Xbox didn’t have, was a built-in Blu-ray player.

ps3-consoles

Above: both the PS3 and the PS3 Slim

However, for the first time, technical superiority did not tide Sony over in sales. There were two big reasons. First, the PS3 cost around €600, some €200 more than the Xbox 360. Second, and most importantly, Sony had forgotten the most important selling point, the gaming experience: the new console lacked backwards compatibility, meaning that people who traded their PS2 in for a PS3 could no longer play their past games, whereas Xbox owners could. Customers saw no merit in buying a more expensive console and jumped ship to the cheaper Microsoft device.

Nintendo’s Tangent

nintendo_wii

the Wii helped make simple games popular & successful once again

After the failure of the GameCube, Nintendo realised they were unable to beat Microsoft and Sony on the same playing field. In 2006 Nintendo released the Wii. Although it had no CD or DVD support and no HD graphics, what it did have was the new Nunchuk controller and a new target audience and demographic. Nintendo now targeted families as a whole, and its more family-orientated games proved successful, with many families worldwide buying the console and a controller for each member of the family. This was the first time such a thing had ever happened in the video game world.

wii_u

the Wii U is possibly a hint of what’s to come regarding consoles in the future

Six years later Nintendo released the Wii U with the same creative intent they had shown with their earlier console. Nintendo were beginning to get it right once again: the new console featured HD graphics, better online support and, for the first time ever, a tablet-based controller with touch capabilities that could also serve as an alternative screen if others wanted to watch television. With backward compatibility for both hardware and software, Nintendo had finally entered the HD race, once again making them a worthy competitor in the video games market.

Game Engines

Hardware dictates how technically advanced the games featured on a console can be. Both the Xbox and the PlayStation have been high-definition, 60 frames-per-second machines ever since their releases. However, because the hardware was so capable, for the first time ever among consoles the graphics did not reach their full potential until more recent years.

master_chief

above is the evolution of Halo’s Master Chief over the last 12 years

Video games are built on what we now know as engines. These engines dictate the graphics, controls, physics and, most importantly, the smoothness of the game. With the help of these engines, developers have begun making games with more detailed graphics, rendering and textures. You can clearly see the difference between the games published for the Xbox or PlayStation in 2006 and what is being produced today in 2013. Most importantly, though, the biggest improvement came from players’ desire to be more involved in games as they became more attached to stories, which drove constant improvement in player experience on the developers’ part.

Future of Video Games

Video games are now a pillar of media, just as film and television are. The result is a constant desire to improve presentation and experience. Although game engines have prolonged the lifespan of video game consoles, companies now want to improve their consoles not just for video games but for overall media experiences and services such as Netflix, sports channels and television.

One of the problems at the moment is that the game engines carry the visual graphics while the hardware has still not been fully utilised. The question now is: what will these new consoles bring to the table other than new multimedia experiences?

Currently the Wii U, Xbox 360 and PS3 all run games at 60 frames per second, meaning that for every second of play 60 individual pictures flicker past, which gives games their smooth look. Perhaps the next consoles will feature integrated graphics capable of supporting around 80 to 100 frames per second, making the presentation look even smoother to the player’s eye.
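To put those frame rates in numbers: a higher frame rate shrinks the time a console has to draw each picture. A minimal sketch of that arithmetic in Python, using the rates mentioned above:

```python
# Each frame must be drawn within 1/fps seconds or the motion stutters.
for fps in (30, 60, 80, 100):
    budget_ms = 1000 / fps          # time available per frame, in ms
    print(f"{fps:>3} fps -> {budget_ms:.2f} ms to draw each frame")
# At 60 fps a console has ~16.7 ms per frame; 100 fps leaves just 10 ms.
```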

Console manufacturers no longer seem to focus on just video games. Thanks to advancements in technology, these consoles have evolved from devices that just play games into all-in-one multimedia centres, making other technology and routes to media, such as DVD players and physical disc rentals, obsolete. Lifelong video game fans have little choice but to accept the new generation of games consoles with media capabilities, whether they wanted them in the first place or not.

We are once again at a pivotal moment for video games. Perhaps the release of the new Xbox or PlayStation will mark the death, or retirement, of the video games console and the birth of the “media centre”, since playing games will be only one function among many.

References

Top 10 Video Games Consoles of All Time – thegameconsole.com – http://www.thegameconsole.com/

Xbox – http://en.wikipedia.org/wiki/Xbox

Playstation 2 – http://en.wikipedia.org/wiki/PlayStation_2

Xbox 360 – http://en.wikipedia.org/wiki/Xbox_360

& http://www.xbox.com/en-US/xbox360/consoles/bundles/xbox360-250GB-bundle

Playstation 3 – http://en.wikipedia.org/wiki/PlayStation_3

& http://us.playstation.com/ps3/techspecs/assassin-s-creed-bundle.html

Nintendo Wii – http://en.wikipedia.org/wiki/Wii

Nintendo Wii U – http://en.wikipedia.org/wiki/WiiU

Master Chief’s evolution – Gears of Halo – http://www.gearsofhalo.com/search/label/halo%204

Image References

Xbox360 – http://en.wikipedia.org/wiki/Xbox_360

Playstation 3 – http://en.wikipedia.org/wiki/PlayStation_3

Nintendo Wii – http://en.wikipedia.org/wiki/Wii

Nintendo Wii U – http://en.wikipedia.org/wiki/WiiU

The History of Video Games

Video games are now like any other domestic appliance in the home. Today, a new generation of children are born into households that already have a games console in them. But 40 years ago there was a pivotal point when video games were at a make-or-break moment.

pong_picture

pong: often acclaimed to be the very first video game ever made

In 1972 Nolan Bushnell, founder of Atari, with the help of Al Alcorn, went on to design Pong just months after the company was founded. The two then fashioned a makeshift game cabinet, rigged it up to a coin mechanism and placed it in a local bar to test it. When they returned the next morning they found the coin box full, and it marked the beginning of a new era in leisure & recreation.

atari_games_system

the Atari VCS marked the birth of the console era as one of the first consoles to enter homes

After a number of arcade games, Atari developed a home console, the Atari Video Computer System, in 1977, partly because arcades were beginning to be associated with anti-social and degenerate behaviour. This marked the domestication of video games and pointed to a bright future.

nes_

the NES produced some of the most iconic games and characters of all time

By the 1980s the threat of a games recession began to show. But newcomer Nintendo released the Nintendo Entertainment System in 1985, proving that the recipe for success lay not in the technology but in the games, with iconic characters like Mario and Link (The Legend of Zelda) who remain timeless protagonists of the video game world.

sega_genesis

the Genesis helped SEGA beat Nintendo in the bit race

In 1989, SEGA released a 16-bit console called the Genesis. Although hardly anybody knew what a bit was at the time, people did know that more bits were always better, and the Genesis’s release marked the beginning of a bit war between the two companies, answered by the 16-bit Super Nintendo in 1992 and the 32-bit SEGA 32X in 1994.

The war went on and on until the electronics giant Sony released the PlayStation in 1994. This was a leap forward, as it was among the first consoles built around the CD (compact disc), making games much cheaper and easier to produce, and it also marked a step forward in home entertainment.

sony_playstation

the PlayStation, as pictured above, is one of the most successful games consoles of all time

ps2_

the PS2 helped introduce the DVD into homes worldwide

Sony had gained so much momentum that mere speculation about the PlayStation 2 kept a technologically great console such as the SEGA Dreamcast from taking off. In the year 2000 Sony released the PlayStation 2, a games console with not only a CD drive but now a DVD player as well. Once again Sony had provided a superior console and introduced a new technology into households worldwide.

microsoft_xbox

the Xbox was the first console that ever posed a real threat to Sony

Then Microsoft entered the market. In 2001, Microsoft released the Xbox, which could do everything the PlayStation could do and better. The Xbox ran on a 32-bit CPU and held a 10GB hard drive, with network capabilities and downloadable content. By 2002 it had marked the decline of past hardware giants Nintendo and SEGA and the passing of the torch to Sony and Microsoft for the next 10 years.

References

Top 10 Video Games Consoles of All Time – thegameconsole.com – http://www.thegameconsole.com/

Pong – http://en.wikipedia.org/wiki/Pong

Nintendo Entertainment System –  http://en.wikipedia.org/wiki/NES

Super Nintendo Entertainment System – http://www.NES-Bit.com

SEGA Genesis – http://en.wikipedia.org/wiki/Sega_Genesis

SEGA Saturn – http://en.wikipedia.org/wiki/Sega_Saturn

Sony Playstation – http://en.wikipedia.org/wiki/PlayStation

Nintendo 64 – http://en.wikipedia.org/wiki/Nintendo_64

SEGA Dreamcast – http://en.wikipedia.org/wiki/Dreamcast

Playstation 2 – http://en.wikipedia.org/wiki/PlayStation_2

Microsoft Xbox – http://en.wikipedia.org/wiki/Xbox

Images references

Pong – http://en.wikipedia.org/wiki/Pong

Nintendo Entertainment System –  http://en.wikipedia.org/wiki/NES

Super Nintendo Entertainment System – http://www.NES-Bit.com

SEGA Genesis – http://commons.wikimedia.org/wiki/File:Sega-Genesis-Mod2-Set.jpg

SEGA Saturn – http://en.wikipedia.org/wiki/Sega_Saturn

Sony Playstation – http://jdanddiet.blogspot.ie/2010/11/sony-playstation-special-part-1.html

Playstation 2 – http://en.wikipedia.org/wiki/PlayStation_2

Microsoft Xbox – http://en.wikipedia.org/wiki/Xbox

History and Future of Multimedia on the web

History of Multimedia on the web

The name multimedia is quite self-explanatory. Broken down, the first part of the word, multi, means many, and media is the collective term that covers a whole host of creative mediums such as music, art and the sciences. In more recent times multimedia is often used as an adjective to describe devices that support things like photo, music and video playback, in an effort to convey the many things a wonderful new product is capable of.

However, I think it’s fair to say that the rise of the internet over the past 20 years has presented it as a host for all the pre-existing mediums, as most art and music can easily be represented on a screen. That said, the web has also brought its own forms of media. One form currently going through an evolution is journalism, and writing in general: blogs and social networks seem to have swallowed their physical counterparts, and both of those forms now exist solely online.

Photographers and movie makers have also had to adapt the principles of their professions to either harness or combat the power of the web. Photographers seem to have fared better than the latter, as peer-to-peer sharing has become the casual norm for seeing the latest movie instead of buying a DVD.

When we look at the beginnings of multimedia on the web we have to take ourselves back to the mid-1990s, when internet connection speeds were at a snail’s pace in most places and server bandwidth came in small capacities. This resulted in very basic web designs, and anything you saw moving was usually an animated GIF. Pictures were as exciting as online media got, and over dial-up even what would now be considered a small file took quite some time to download before it could be viewed.

Web standards technologies were in their infancy at the time, so it was near impossible to substitute CSS for images: things like rounded corners (to make circles or rounded rectangular buttons) were not part of CSS styling back then. If designers wanted their websites to look a certain way or follow a particular trend, then the custom elements on a page had to be images, which meant the images on a web page loaded after the content. In hindsight it was a horrible user experience, but at the time there was nothing else to compare it to. This wasn’t a big deal, though; as with most new technology, people were sceptical and slow to adopt it, so there weren’t many websites because there wasn’t much demand yet.

Screen Shot 2013-03-06 at 16.25.55

Over time the web of course became more powerful and internet speeds grew quicker, and pictures, it seemed, were the first form of media that had people buying desktop PCs and browsing the web. They brought the first signs of the web overthrowing its real-world counterparts. Being able to take pictures and upload them to a website soon meant that people could meet other people online instead of going out to a bar, or, if they wanted to sell an old TV, all they had to do was put up a picture and add a contact number, and that was that; no middleman needed, unlike the pawn shops.

As the internet grew stronger and faster, so did its adoption rate. By the time music sharing became popular, most young people and young families had a household PC. Many websites at this stage were being built in Flash, so for the first time people were witnessing a true multimedia experience, as Flash websites tended to be more extravagant than their HTML counterparts. With Flash you could easily incorporate music, manipulate images and even apply various effects to the cursor. These websites lived up to the name Flash, because that’s what they were: flashy. It was an important moment in modern web history, as people saw what websites could be capable of. It wasn’t long before developers used Flash in a way that would open up an entirely new sub-genre to the masses: online games within the browser. Flash games were basic in comparison to other gaming platforms such as the PlayStation and native PC, but could be made by a handful of people in a very short space of time, whereas titles for those much larger platforms took far longer. The area was lucrative too; nowadays Flash games take in ——– in revenue.

Not too long after, in 2004(?), video began to make its noticeable impact on the way people consumed and browsed the internet. While it was

Future of Multimedia on the web

It’s very difficult to predict how multimedia will be used on the web too far down the line, as we also have to think about how the web itself is going to evolve as a host in the coming years. In the past 10 years we have seen the rise of the post-PC era, in which smartphones and tablets have slowly begun to take over, and with it a shift in the way multimedia is presented to suit these devices. With smartphones we have seen a resurgence of, and emphasis on, pictures, as mobile apps such as Instagram and Twitter encourage people to share what’s around them.

It does seem clear, however, that we are slowly reducing the size of these devices and the need for manual setup and labour. Everything is becoming more streamlined and instant; everything is connected now. Google have taken a leap with their latest beta project, Google Glass. The glasses project seems to be an attempt to bridge the gap between having to carry a device and having the technology almost embedded on you. Many old films and predictions from the ’70s and ’80s conjured images of people controlling things with their minds, or simply talking while something listened to their commands; even in Minority Report, gestures and the entire body are used to control what you consume and see. It’s an attempt to make the body the device, and in a way to yet again cut out the middleman. And if bodily interaction is necessary, then it should be designed in such a way that it really takes advantage of all the different manipulation techniques we are able to carry out as humans. Ex-Apple user interface designer Bret Victor has written about this in detail at http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/

If future technologies take advantage of this, it doesn’t mean the internet is going anywhere; it just means the landscape for multimedia on the web will look very different from what it is now. Pictures might gain a whole new dimension: users may be able to peel pictures back with a peeling gesture. This in turn could breed a whole new wave of photography and set new principles of interface design, akin to how the iPhone changed the standard for mobile design. Looking past the current standards of interaction and adapting to new, undiscovered methods is, I think, what will breed the next age of multimedia on the web.

History and Future of sound recording

History of sound recording blog

Did you ever stop to think how long recorded sound and music have been around? I’ve asked a few people, and most guess about 80 to 100 years, when in fact it’s been over 150 years: the very first sound recording device was invented in 1857, and in 1860 it recorded Au Clair de la Lune. That device was the phonautograph, invented by the French inventor Édouard-Léon Scott de Martinville. Although the phonautograph could not play back sound, it scribbled the sound waves onto a sheet of paper called a phonogram, because Édouard-Léon wanted to study the waves. The technique used a bristle on a sheet of soot-covered paper; passing sound waves would cause the bristle to move. Only in 2008 were researchers able to play back the sounds recorded on those phonograms.

Image

So who invented the first working device that could play back sound? Thomas Edison. He invented the phonograph, which borrowed many ideas from the phonautograph: instead of a bristle sketching onto paper, the phonograph used a needle to indent the vibrations onto a tinfoil cylinder. The technique was completely mechanical, recording as a crank was turned, but it wasn’t the most user-friendly device: it was difficult to operate except by experts, and the tinfoil wore away after the recording was played a few times.

Image

So that’s the story of how it began; just how did it progress? We went from mechanical recording to recording electronically in analogue.
It started in the mid-1920s with 78rpm discs, then moved from vinyl on to cassette tapes. Audio was recorded and played back using magnetic tape, which worked by arranging the magnetic particles on the tape in such a way that the playback device could read them and turn them back into sound waves.

In modern recording, everything is done digitally: audio is recorded and edited on computers using audio-editing software. It can be distributed without making hard copies, and most music sold nowadays is bought through online purchases and downloads, making digital music the most profitable format because there are no distribution costs.

To capture analogue waveforms and transform them into binary code (digital files), we use a transducer to convert the sound wave into an electrical signal; examples of transducers are a microphone or the pickups of a guitar.
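Once the transducer has produced an electrical signal, an analogue-to-digital converter samples it at regular intervals and quantises each sample into a number. Here is a minimal Python sketch of that idea, assuming CD-style parameters (44.1kHz sample rate, 16-bit samples) and using a pure 440Hz sine wave as a stand-in for the microphone’s signal:

```python
import math

SAMPLE_RATE = 44100   # samples taken per second (CD standard)
FREQ = 440.0          # pitch of the stand-in tone, in Hz

def analogue(t):
    """Amplitude of the 'analogue' wave at time t (in seconds)."""
    return math.sin(2 * math.pi * FREQ * t)

# Sample the wave at regular intervals and quantise each sample to a
# signed 16-bit integer -- the binary code stored in a digital file.
samples = [round(analogue(n / SAMPLE_RATE) * 32767) for n in range(10)]
print(samples)   # ten small integers tracing the start of the sine wave
```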

What makes digital and analogue quality different?

Image

Sound waves are smooth, continuous curves whose shape depends on the sound’s timbre and pitch. Digital sound waves aren’t as smooth as the original, whereas analogue recordings capture the waveform exactly as it was produced.
When recording or playing back digital audio, the sample rate determines how many points of the sound wave are picked out, and the wave is reproduced from those points, as shown in the image below.

Image

The higher the sample rate, the more points of the sound wave are picked out and the more accurate the reproduced waveform.
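A quick way to see this is to count how many sample points land on each cycle of a wave. A small sketch for a 440Hz tone at a few common sample rates (the tone and rates are illustrative choices, not from the original post):

```python
# More samples per cycle means the stored points trace the smooth
# waveform more closely.
TONE_HZ = 440
for rate in (8000, 22050, 44100, 96000):
    print(f"{rate:>5} Hz sampling -> {rate / TONE_HZ:.0f} points per cycle")
```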

Digital recording remains the preferred method because it is far easier to edit: sound on tape had to be cut by hand, and layering different sounds was nothing like the drag-and-drop it is in today’s software.

Referencing

Image 1:
siouxfire, 27 March 2008, “Sound from Soot: The Oldest Recording” Viewed 27 February 2013
http://2.bp.blogspot.com/_cKiU2w48PDs/R-zWO3HwGGI/AAAAAAAACPk/VAq9rcqmzfM/s400/I_phonautograph.jpg

Image 2:
“The History of the Edison Cylinder Phonograph” viewed 27 February 2013
http://memory.loc.gov/ammem/edhtml/home.jpg

Image 3:
centerpointaudio.com, “Understanding the difference between Analogue and Digital Audio” viewed 27 February 2013
http://www.centerpointaudio.com/Analog-VS-Digital.aspx

Image 4:
goodsires, “SampleRate.jpg”, viewed 27 February 2013
http://s717.beta.photobucket.com/user/goodsires/media/SampleRate.jpg.html

Video:
ToneSpectra, 21 August 2010, “The first known recording of a human voice, from April 9th, 1860.”

http://www.youtube.com/watch?v=uBL7V3zGMUA

Wikipedia.org, “History of sound recording” viewed 13 December 2012
http://en.wikipedia.org/wiki/History_of_sound_recording

firstsounds.org, ‘Édouard-Léon Scott in his own words’, viewed 13 December 2012
http://www.firstsounds.org/features/scott.php

cylinders.library.ucsb.edu, ‘The Phonautograph and Precursors to Edison’s Phonograph’, viewed 13 December 2012
http://cylinders.library.ucsb.edu/history-early.php

inventors.about.com, “The Inventions of Thomas Edison”, viewed 13 December 2012

http://inventors.about.com/library/inventors/bledison.htm

recording-history.org, “Overview History of the Technologies for Recording Music and Sound”, viewed 13 December 2012
http://www.recording-history.org/HTML/musictech1.php

23 June 2012, Rev, “An Audio Timeline” viewed 13 December 2012

http://www.aes.org/aeshc/docs/audio.history.timeline.html

Future of sound recording blog

We are at a stage in digital recording where we can get almost perfect sound quality in our recordings, depending on the bit rate and frequency response. Bit rate is how many bits of data are used to represent each second of audio when the file is converted from binary code back into sound waves. We could record at megabits per second, but such a high bit rate would make a song a ridiculous file size; the maximum bit rate for the common MP3 audio format is 320 kilobits per second.
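The trade-off between bit rate and file size is simple arithmetic: bit rate multiplied by duration gives the bits in the file. A rough sketch for a four-minute song (1,411 kbit/s approximates uncompressed CD audio; the song length is an assumption for illustration):

```python
DURATION_S = 4 * 60   # a four-minute song

for kbps in (128, 320, 1411):
    size_mb = kbps * 1000 * DURATION_S / 8 / 1_000_000   # bits -> megabytes
    print(f"{kbps:>4} kbit/s -> {size_mb:.1f} MB")
# A 320 kbit/s MP3 of a four-minute song comes to roughly 9.6 MB.
```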

Most people cannot hear any difference between audio at different bit rates, so MP3 does well for most people, but I think MP3 will die out and a new, higher-quality format with a higher bit rate and wider frequency response will take over. Memory for data storage on phones, computers and portable music players was very low years ago in comparison to today: in 2000 the average computer had about 20GB, and just over a decade later we have multiple terabytes at our disposal. You can imagine how little memory music players had, likely around 500MB in many cases, so lower quality meant smaller files and more songs stored. With the constant rise in storage capacity, we can store more audio files at higher quality.

Sample rates are important to understand in the world of digital recording: they determine the range of sound frequencies that can be represented. Sound waves are smooth, and a lower sample rate can’t trace that smooth waveform as faithfully as a higher one; broadly, the higher the sample rate, the higher the sound quality.

Image

I think sample rates are going to go up. The current standard for CD audio is 44.1kHz, but CD is dying out, and with memory sizes climbing all the time we’ll have the space for higher sample rates, just as with bit rates.

The standard frequency response for equipment is 20 to 20,000Hz. Hz measures how many wave cycles occur per second: fewer cycles per second mean a lower pitch, more cycles a higher pitch. MP3 cuts off any frequency lower than 20Hz or higher than 20,000Hz to save space; once again most people can’t hear the difference, but it does affect the sound quality. Some headphones and speakers offer a wider range of frequencies, around 5 to 33,000Hz, and these will probably become more common in later years once audio formats carry more frequencies; there is little point in such equipment while listening to MP3 files, because the frequencies have already been cut off.
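The post doesn’t spell out the link between the 44.1kHz sample rate and the 20 to 20,000Hz hearing range, but it comes from the Nyquist limit: a digital recording can only represent frequencies up to half its sample rate. A one-liner per rate makes the ceiling visible:

```python
# Nyquist limit: the highest representable frequency is half the
# sample rate, which is why 44.1 kHz comfortably covers human hearing.
for rate in (44100, 48000, 96000):
    print(f"{rate} Hz sampling -> frequencies up to {rate // 2} Hz")
```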

The quality of sound keeps getting better with time, as we constantly improve equipment and make room on hard drives for higher-quality sound. If you listen to a song from 20, or even 10, years ago, you can hear the difference in quality, and that difference is growing rapidly.

Although digital recording is known for not having quite the same character of sound as cassette tape or vinyl, because they are analogue, I think one day it might even surpass them: analogue playback has that smooth sound wave, but it’s harder to edit, and you can do so much more with digital recording.

The price of recording equipment is going down all the time, and many people have home studios these days. Before digital recording, people needed to pay to rent a recording studio or get a record deal in order to record music. In recent years sound-editing software has been released at affordable prices, so once people have a simple external sound card or mixer they are ready to go. Although this method of recording doesn’t match the sound quality of a professional studio, it’s still an effective way to record an EP.
So, where is this going?
Record companies are barely needed anymore; they are dying out, and recently the majors went from the Big Four to the Big Three. Music stores like HMV are closing down. My guess is that both record companies and music stores will largely die out, because they simply aren’t needed anymore.
With equipment becoming more and more affordable, people are going to record at home and just hire sound engineers, which means music will mostly be sold online. If someone wants to buy a CD or vinyl they’ll have to order it online, unless some music shops are still lying around in their local area; people like collectables, so cities will always have at least one music store.

In short, music files will get bigger but with better quality to show for it, people will record at home in their own recording studios and music stores will become more scarce.

Referencing

Image 1:
labtronix.co.uk, “About Oscilloscope Sample Rate”, viewed 27 February 2013
http://labtronix.co.uk/drupal/content/about-oscilloscope-sample-rate

Jay Jennings, 23 January 2012, “What is the future of recording technologies?” viewed 13 February 2013
http://socialsounddesign.com/questions/12331/what-is-the-future-of-recording-technologies

Wikipedia.org, “Music Industry”, viewed 13 February 2013
http://en.wikipedia.org/wiki/Music_industry

Thecanadianencyclopedia.com, “Recorded sound technology and its impact”, viewed 14 February 2013
http://www.thecanadianencyclopedia.com/articles/emc/recorded-sound-technology-and-its-impact

Graphic Tablets & Mind Reading Computers

Graphic Tablets

Introduction

When talking about a graphics tablet, most people associate it with someone doing digital illustration. Few realise that digital animators, AutoCAD users and digital photographers, to name a few, also use them heavily [1]. In parts of Asia, graphics tablets are used to write Chinese, Japanese and Korean characters [2].

But what is a graphics tablet for someone who has not encountered one? Graphics tablets are input devices, like a keyboard or mouse. Instead of typing or clicking, the tablet acts as a digital canvas upon which you can draw, just as you would on paper with a pencil or pen. The tablet translates the drawing into data on a computer [2][3].

History

Elisha Gray created the first graphics tablet in 1888. The device, named the Telautograph, worked by transmitting a signature or drawing from one device to another over telegraph lines. It wasn’t until 1957 that a device resembling modern graphics tablets was produced, in the form of the ‘Stylator’. This worked via acoustics: the stylus, equipped with a spark plug, produced clicking noises when it touched the pad, and this noise was picked up and triangulated by a series of microphones to pinpoint the location of the stylus. The drawbacks of this method were that it was expensive and susceptible to interference from outside noise. In 1964 the RAND tablet introduced the technology that is the basis of modern graphics tablets: a series of wires laid in a grid under the surface of a pad, encoding horizontal and vertical coordinates as electronic signals. The RAND stylus received these signals and decoded them back into position information; other tablets use varying methods similar to this [4][5][6][7].
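Accounts of the RAND tablet generally describe its grid signals as Gray-coded, an encoding in which neighbouring positions differ by a single bit so a moving stylus never misreads a coordinate badly. As a sketch of that decoding step (the Gray-code choice is an assumption drawn from those accounts, not stated above):

```python
def gray_to_position(code: int) -> int:
    """Decode a Gray-coded word into a plain binary coordinate."""
    position = 0
    while code:          # XOR in successively shifted copies of the code
        position ^= code
        code >>= 1
    return position

# The first eight Gray codes decode back to positions 0..7 in order.
for g in (0b000, 0b001, 0b011, 0b010, 0b110, 0b111, 0b101, 0b100):
    print(f"{g:03b} -> {gray_to_position(g)}")
```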

How it works

The company Wacom holds many of the key patents on graphics tablets, forcing competitors to devise alternative technologies or pay Wacom licence fees [1][8]. This has resulted in several different tablet technologies being developed.

Passive technology

Passive tablets make use of electromagnetic induction between the tablet and the stylus: the tablet’s grid both powers the stylus and reads its position, so the pen needs no battery [8][9].

Screen Shot 2013-03-06 at 15.58.38

Active technology

Active tablets work much the same way as passive tablets, except that they use a battery-operated stylus that transmits a constant signal to the tablet instead of relying on electromagnetic fields [9].

Screen Shot 2013-03-06 at 15.58.43

Optical technology

Optical tablets use a digital camera inside the stylus, which matches the pattern on the specially printed paper it is used on to locate its position on the page [9][10].

Screen Shot 2013-03-06 at 15.58.53

Capacitive technology

Capacitive technology is widely used by mobile devices such as touchscreen phones and tablet computers. It relies on the electrical properties of the human body to detect where and when the user is touching the display. This means a user can operate the tablet without a stylus or a gloved hand, using light touches of a finger instead. Recent developments have enabled gestures involving multiple fingers; on other tablet technologies a user is restricted to one point of contact at a time [9][11][12].

Screen Shot 2013-03-06 at 15.59.09
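As a toy illustration of the idea (not any vendor’s actual firmware), the touch controller can be thought of as scanning a grid of capacitance readings and reporting wherever the body’s added capacitance peaks; all the values below are invented:

```python
# Invented capacitance-change readings for a 3x4 sensor grid; the
# finger's extra capacitance shows up as a peak near row 1, column 1.
grid = [
    [0.0, 0.1, 0.0, 0.0],
    [0.1, 0.9, 0.3, 0.0],
    [0.0, 0.2, 0.1, 0.0],
]

def find_touch(readings, threshold=0.5):
    """Return (row, col) of the strongest reading above threshold, else None."""
    value, r, c = max((v, r, c)
                      for r, row in enumerate(readings)
                      for c, v in enumerate(row))
    return (r, c) if value >= threshold else None

print(find_touch(grid))   # -> (1, 1)
```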

Conclusion

With graphics tablet technology getting cheaper, the market has opened up to the average user rather than only the serious illustrator. People are beginning to use graphics tablets more for general computer use, as they reduce the risk of repetitive strain injury and feel more natural.

Bibliography

  1. Before You Buy a Graphics Tablet – Sue Chastain (graphicssoft.about.com/od/aboutgraphics/a/graphicstablets.htm)
  2. What is pen tablet? – huion (huion-tablet.com/about/press.php?id=13)
  3. What Is a Graphic Tablet? – Beth Bartlett (ehow.com/about_5181024_graphic-tablet_.html)
  4. The Telautograph – Jeremy Norman (historyofinformation.com/expanded.php?id=3270)
  5. The Technological Development of Graphics Tablet through History – Shin (xja1.wordpress.com/2009/03/17/the-technological-development-of-graphics-tablet-through-history/)
  6. The complete guide to digital illustration – Caplin, S, Banks, A & Holmes, N. (books.google.ie)
  7. Tablets: Sought by Nobody, Hyped by Everybody – Thom Holwerda (osnews.com/story/22287/Tablets_Sought_by_Nobody_Hyped_by_Everybody)
  8. EMR® (Electro-Magnetic Resonance) Technology – Wacom (www.wacom-components.com/english/technology/emr.html)
  9. Learn how drawing tablets work and why they can benefit your drawing – Vincenzo (drawing-factory.com/how-drawing-tablets-work.html)
  10. The Technology – Anoto (anoto.com/lng/en/pageTag/page:products/mode/view/documentId/1001/)
  11. History of Pen and Gesture Computing: Annotated Bibliography in On-line Character Recognition, Pen Computing, Gesture User Interfaces and Tablet and Touch Computer –  jrward (users.erols.com/rwservices/pens/biblio85.html#Scriptel84)
  12. United States Patent US4600807 – Kable (freepatentsonline.com/4600807.pdf)

Mind Reading Computers

Introduction

Historic advances in technology occur continually, with new discoveries and devices being created that a mere ten years ago were thought to be consigned to the realms of science fiction. It is not unheard of for ideas devised in science fiction to translate into real science: Star Trek alone has birthed countless ideas and inspiration for scientists over the years, from handheld communicators [1][2] (mobile phones) to interstellar travel via warp drive (the subject of early NASA research) [2][3]. The concept of computers interacting with people’s minds directly, and reading them, is now quickly emerging from science fiction into reality.

Already there are simple devices which can do this in a limited capacity, such as the ‘Star Wars Force Trainer’ [4][5], which uses an EEG headset to control the speed of a fan that floats a plastic ball at varying heights in a tube.

Screen Shot 2013-03-06 at 16.03.35

More advanced technology, such as the ‘EPOC’ by Emotiv [6][7], allows a user to interact with a computer on a basic level, moving objects across the screen and selecting specific items. It is tomorrow’s versions of this technology, currently being researched and beginning to bear fruit, that will achieve complete mind-to-computer interaction.

This groundbreaking work is being performed by a handful of neuroscience groups across the globe. Private businesses such as Intel [8] and IBM [9], and research labs such as UC Berkeley [10][11][12][13] and Japan’s ATR Computational Neuroscience Laboratories [14], are each making advances in different areas of this new frontier.

These groups have been researching the field for the last ten years and are now each producing results, so much so that IBM, in its annual ‘5 in 5’ report [15], announced that within five years the keyboard and mouse would be obsolete and people would control computers with their minds, navigating the internet as well as writing and sending emails with a thought.

This is achieved by the computer reading a person’s thoughts directly, rather than through keyboards and mice.

To understand how the technology works, first imagine the visual cortex as a camera: it takes snapshots of what it sees and registers this information with the rest of the brain. The technology deciphers this indexing via detailed scans of the brain so it can reconstruct the image [10][12][13]. This is possible because the brain is, at bottom, a mass of neurons firing in sequences; the visual cortex alone has an estimated 300 million neurons firing at any one time, so it is a case of examining and understanding those patterns.

Currently this is performed using a functional magnetic resonance imaging (fMRI) machine. A subject is shown thousands of random images while the fMRI records how the brain responds to each one. These readings are analysed, and particular patterns of brain activity are attributed to details such as shape and colour. Eventually the computer compiles all of this data into a master decryption key that can identify and reconstruct almost any object the subject sees, without the image needing to be analysed beforehand. Currently the evaluation and reconstruction take hours, but researchers hope to speed this up to the point where the computer can read a mind in real time [10][11][12][13].
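A toy sketch of the matching idea described above (all numbers invented; the real work compares fMRI voxel activity, not three-element lists): record a response vector for each known image, then label a new response by its nearest recorded neighbour.

```python
import math

# Invented "brain response" vectors for images the subject has already seen.
training = {
    "face":   [0.9, 0.1, 0.3],
    "house":  [0.2, 0.8, 0.4],
    "letter": [0.1, 0.2, 0.9],
}

def decode(response):
    """Return the label whose recorded vector lies closest to `response`."""
    return min(training, key=lambda label: math.dist(training[label], response))

print(decode([0.85, 0.15, 0.25]))   # -> "face"
```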

If this technology becomes more streamlined and widespread use becomes a reality, what will it be used for, and what ethical issues will it raise? The researchers have differing ideas about its uses. Neuroscientist and head researcher Jack Gallant at UC Berkeley sees a wide range of applications, mainly medical: he envisions the technology being used in dream analysis, in the practice of psychotherapy, and for communicating with people who cannot normally communicate, such as ‘locked-in’ patients or people with neurological diseases.

ATR Computational Neuroscience sees potential for artists to express their creative imaginations more easily, without anything being lost in translation through a lack of drawing expertise. Both Intel’s and IBM’s researchers hope the technology will create the ultimate interface between people and everyday devices like phones and computers.

Whichever the case, it will certainly bring technology into a closer bond with people and society. With sufficiently advanced forms of the technology, judges could use machines to look into a suspect’s mind and see their involvement in, or innocence of, a crime, and an insanity plea could be tested as genuine or fake. However, there are large ethical issues, such as fears of people’s private thoughts being invaded by government agencies or third parties such as hackers or companies.

But for now we can only speculate, await the development of this technology into mainstream society, and hope that it is not abused and that people will simply use their heads.

Bibliography

  1. Engineers of the Future Design Star Trek-Inspired Tricorder Device – Marissa Fessenden (blogs.scientificamerican.com/observations/2012/11/12/engineers-of-the-future-design-star-trek-inspired-tricorder-device/)
  2. Star Trek technology: how 21st century scientists are making it so – Corrinne Burns (www.guardian.co.uk/science/blog/2012/oct/19/star-trek-technology-scientists)
  3. Alcubierre drive – Ana Sayfa (www.zamandayolculuk.com/cetinbal/HTMLdosya1/AlcubierreWarpDrive2.htm)
  4. Star wars force Trainer (starwars.com/shop/toys/the_force_trainer/)
  5. Star Wars Force Trainer in action – Darren Quick (gizmag.com/star-wars-force-trainer/11304/)
  6. Epoc Features (www.emotiv.com/epoc/)
  7. How the Emotiv EPOC Works – Jane McGrath (electronics.howstuffworks.com/emotiv-epoc.htm)
  8. Computers that read minds are being developed by Intel –  Richard Gray (telegraph.co.uk/technology/news/7957664/Computers-that-read-minds-are-being-developed-by-Intel.html)
  9. Beyond machines – Dharmendra S Modha (ibm.com/smarterplanet/us/en/business_analytics/article/cognitive_computing.html)
  10. The quest to read the human mind – Lisa Katayama (popsci.com/science/article/2010-01/mind-readers)
  11. Mind-Reading Tech Reconstructs Videos From Brain Images –  Dan Nosowitz (popsci.com/science/article/2011-09/mind-reading-tech-reconstructs-videos-brain-images)
  12. Scientists Reconstruct Brains’ Visions Into Digital Video In Historic Experiment – Jesus Diaz (gizmodo.com/5843117/scientists-reconstruct-video-clips-from-brain-activity)
  13. Scientists use brain imaging to reveal the movies in our mind – Yasmin Anwar (newscenter.berkeley.edu/2011/09/22/brain-movies/)
  14. Scientists extract images directly from brain – Maywa Denki (pinktentacle.com/2008/12/scientists-extract-images-directly-from-brain/)
  15. IBM: ‘Your PC will read your mind by 2016’ – Iain Thomson (theregister.co.uk/2011/12/20/ibm_five_future_technology/)