H9832B Duck

Warning: this post is almost the same as Mohammed’s, because we worked together on our final project.

H9832B Duck is our project’s name. H9832B Duck is a robot that obeys commands sent from Twitter. I explained the general idea in my last post. To make it work, we had to face some technological and mechanical challenges. I’ll talk about the technological and programming issues, and Mohammed will cover the mechanics.


Let’s start by introducing all the components of the project:

  • Duck

Fig.01 Duck

  • LCD screen

Fig.02 LCD

  • Arduino

Fig.03 Arduino

  • Sensor shield v4

Fig.04 Sensor shield v4

  • Laptop

Fig.05 Laptop

According to the architecture in my last post, this project has two main modules. The first module deals with sending the tweet from Twitter to the computer. The second one is responsible for executing the command on the Arduino.

Just as a reminder, here is the architecture again (see my last post):


Twitter and Python

The first step is to send a tweet to the duck. For this reason, we created its own account on Twitter: @H9832B. Like every user, @H9832B follows others. In this case, “following” means “I accept your commands”. Every user can send tweets to the duck (of course), but it only obeys those it follows. Following is the way to allow other users to command the duck; those users don’t need to be followers themselves. The duck can also command itself.


Fig.06 Twitter

The next step is to send the tweet to the computer. Actually, Twitter doesn’t send anything to the computer; it’s the computer that searches for new tweets. We achieved this by programming a script in Python. Basically, this script:

  1. Accesses @H9832B’s twitter account.
  2. Gets its friends (followings) including itself (@H9832B).
  3. For each friend, gets their last tweet.
  4. Filters those tweets that are commands (they have a specific syntax).
  5. Chooses the first one.
  6. Sends that command to the serial port (where the Arduino is expected to be).
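Steps 4 and 5 can be sketched in a few lines of Python. This is a minimal illustration, assuming the “@H9832B … $command$” syntax described later in the post; the function names are our own, not from the original script:

```python
import re

# Sketch of steps 4-5: keep only tweets that are commands and choose the
# first one. A command tweet mentions @H9832B and wraps the command in '$'.
def extract_command(text):
    """Return the command between '$' signs, or None if the tweet is not a command."""
    if "@H9832B" not in text:
        return None
    match = re.search(r"\$(\w+)\$", text)
    return match.group(1) if match else None

def choose_command(latest_tweets):
    """Given each friend's last tweet, return the first valid command found."""
    for text in latest_tweets:
        command = extract_command(text)
        if command is not None:
            return command
    return None
```

The real script obtains the tweets through the python-twitter calls described below and then writes the chosen command to the serial port.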

Normally, signing in to a Twitter account requires a person (the owner of the account, or his/her boyfriend/girlfriend) to access the Twitter website through a web browser and enter a username and password manually. Can Python do that by itself? Of course it can’t. But there is a method that lets us automate this process. First, we had to go to the Twitter developers site and sign in (with @H9832B’s account). Then we created an application, giving it a name and a description. Once we created our application, we got the information for OAuth. OAuth is an authentication protocol that allows users to approve an application to act on their behalf without sharing their password. Specifically, the information required is: Consumer Key, Consumer Secret, Access Token and Access Token Secret. Twitter supplies that information in the application page itself.

Twitter developers

Fig.07 Twitter developers

After creating our application and getting the relevant information, we installed the python-twitter library on the computer. This library provides a pure Python interface for the Twitter API. This way, we can perform operations by using just Python code. A complete description of this API can be found here. Some of the instructions we use are:

  • import twitter //imports the python-twitter library
  • api = twitter.Api(consumer_key,
                      consumer_secret,
                      access_token_key,
                      access_token_secret) //creates the connection
  • api.GetFriends() //gets followings
  • api.GetUserTimeline(username) //gets a user's timeline

To send a command to the duck, the user must write a tweet with the following format: “… @H9832B … $command$ …”, i.e., the tweet must mention “@H9832B” and the command must be written between “$” signs. Currently, the duck can perform two actions and it understands three commands: “wings” to move its wings, “head” to move its head and beak, and “everything” to move its wings, head and beak. The Python script checks for new tweets every 10 seconds. If there is a new tweet, the command (the text between “$”) is extracted and sent to the serial port, where the Arduino is supposed to be. The information sent to the Arduino has the following format: “@user -> command”.


Fig.08 Commands

The other important issue concerning the Python script is its communication with the Arduino. Talking to an Arduino over a serial interface is pretty trivial in Python: there is a wrapper library called pySerial that works really well. This module encapsulates access to the serial port. Data transmission over the serial port is made byte by byte, so, to send a whole text, it must be sent character by character. The methods we use are:

  • import serial //imports the pySerial library
  • arduino = serial.Serial('/dev/ttyACM0',
                              timeout=1) //creates the connection with serial port
  • arduino.write(character)
    //writes a character to the serial port (or, in other words, sends a character to Arduino)
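As a sketch of the character-by-character transmission, the sending side could look like this. This is our own illustration (send_message is a name we made up); the real script opens the port with serial.Serial as shown above:

```python
def send_message(port, message):
    """Send a whole message over the serial port one character at a time,
    since the transmission is made byte by byte. `port` is any object with
    a write() method, e.g. an open serial.Serial instance."""
    for character in message:
        port.write(character.encode("ascii"))

# With pySerial this would be used roughly as:
#   arduino = serial.Serial('/dev/ttyACM0', timeout=1)
#   send_message(arduino, "@user -> wings")
```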

Arduino and the duck

As a command is sent to the Arduino, two things can happen: if the command is known, the duck obeys the instruction and the message “@user -> command” is displayed on the screen; if not, the duck does nothing and the screen displays the message “I don’t understand”. Besides, the duck blinks periodically. This behaviour is possible thanks to all the electronic components involved. There are two LEDs (for its eyes) and two servo motors (one for its wings and another for its head and beak) inside the duck, plus an Arduino and a sensor shield to control those components. But a proper programme running on the Arduino is also required.

The duck and the LCD

Fig.09 The duck and the LCD

The Arduino is programmed in a language called Wiring. The programme is written in the Arduino IDE, which you must first install on your computer. When the programme is done, it’s uploaded to the Arduino and run there. This programme manages every component in our project.


Fig.10 Programming

First of all, we needed to set up every component: pins, positions, etc. I’d like to clarify that we don’t connect the components to the Arduino directly; we use the shield instead. Every time I say “connected to the Arduino”, I mean “connected to the shield”. We have two LEDs connected to two pins on the Arduino, two servo motors connected to two other pins, and an LCD screen connected to BUS1 on the shield. It’s very important to check that the pins used by the BUS don’t overlap with the pins used by the LEDs and the motors. Since the motors make rotating movements, their positions must be set up as angles (degrees), so it’s necessary to give the initial and final angles of the motion. We also defined a special counter for the blinking of the duck’s eyes and some constants for the screen: number of rows, number of columns, speed of scrolling, etc.

Secondly, the programme starts with the setup() method. The project configuration is initialized here:

  • We set up the pin modes and initial values (HIGH, to turn the LEDs on) and attached the motors to their corresponding pins.
  • We created the connection with the serial port (from which the computer sends the commands).
  • We initialized the LCD screen: its size (2 rows and 16 columns) and initial message (“Hi, I’m H9832B Duck”).

Finally, the loop() method runs again and again all the time. The loop() is the heart of the programme, and its code controls the behaviour of the duck and the screen. Specifically:

  • It checks for new commands.
  • It makes the duck blink if applicable.
  • It scrolls the screen.


We wanted messages on the screen to appear on the bottom row, from right to left. When a message started to disappear on the left, it would start to appear on the top row, also from right to left. And when it disappeared completely on the left, it would appear on the bottom row again and the process would restart. The LCD library includes its own scroll() method, but it scrolls each row independently, so it didn’t offer the behaviour we needed.

Our own screen_scroll_next() function doesn’t actually scroll the screen. It just prints different messages again and again, creating the effect of scrolling. To illustrate, imagine a screen with 1 row, 6 columns and the message “hello”. We just print the following sequence of messages:

01: _ _ _ _ _ _
02: _ _ _ _ _ h
03: _ _ _ _ h e
04: _ _ _ h e l
05: _ _ h e l l
06: _ h e l l o
07: h e l l o _
08: e l l o _ _
09: l l o _ _ _
10: l o _ _ _ _
11: o _ _ _ _ _
01: _ _ _ _ _ _

The effect is that the message “hello” appears on the right and disappears on the left. Simple but effective! The message “scrolls” one character per loop step.
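The frame sequence above can be generated with a couple of lines. Here is our own sketch of the idea (in Python rather than Wiring, just to show the logic; the function name is ours):

```python
def scroll_frames(message, width):
    """Build the sequence of fixed-width frames that fakes scrolling:
    the message enters from the right and leaves on the left."""
    padded = " " * width + message + " " * width
    return [padded[i:i + width] for i in range(len(message) + width)]

# Printing each frame in turn on a 6-column row reproduces the
# sequence shown above for "hello".
```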

The screen always shows the same message until an event makes it switch. At the beginning, the screen shows “Hi, I’m H9832B Duck”. When a command is received, the message shown is the last user and command executed, in the format “@user -> command”. If the command was unknown, the message is “I don’t understand you”.


Fig.11 LEDs


Approximately every 5 seconds, the duck blinks. We have a special counter that starts at 0 and increments by 400 (0.4 seconds) every step of loop(). When the counter reaches 4800, the duck blinks once; when it reaches 9600, it blinks once again; and when it reaches 10000, it blinks twice. Then the counter restarts at 0. This sequence creates a non-uniform, natural blinking. We’re aware that the red colour gives our robot a psycho-killer look.
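In Python pseudocode (the real version lives in the Wiring loop(); the function names here are our own), the counter logic reads roughly like this:

```python
def blinks_for(counter):
    """Blinks to perform at a given counter value: once at 4800 and 9600,
    twice at 10000 (our reading of the sequence described above)."""
    if counter in (4800, 9600):
        return 1
    if counter == 10000:
        return 2
    return 0

def run_loop(steps):
    """Simulate `steps` iterations of loop(): the counter grows by 400
    (0.4 s) per step and restarts after the double blink at 10000."""
    counter, total_blinks = 0, 0
    for _ in range(steps):
        counter += 400
        total_blinks += blinks_for(counter)
        if counter >= 10000:
            counter = 0
    return total_blinks
```

One full cycle takes 25 loop steps and produces 4 blinks, spread unevenly, which is what makes the blinking look natural.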

Servo motor

Fig.12 Servo motor


On every step of the loop, the programme checks for new commands; that is, it checks whether new information has been sent over the serial port, in other words, whether there is new data available or not. Because data transmission over the serial port is made character by character, we coded a specific function that reads the serial port character by character and puts the characters together to build a whole message. The Python script takes care of sending messages with the format “@user -> command”, but it’s the Wiring programme that must check whether the “command” is known to the duck. So we wrote another function, getInstruction(message), to extract the right side of the “->”. If the instruction obtained is known, the duck performs the corresponding action. If not, it does nothing and the screen shows the message “I don’t understand”.
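That checking logic can be sketched like this (in Python for readability; the real functions are written in Wiring, and only getInstruction is a name from our actual code):

```python
KNOWN_COMMANDS = {"wings", "head", "everything"}

def get_instruction(message):
    """Extract the instruction on the right side of '->' in '@user -> command'."""
    if "->" not in message:
        return None
    return message.split("->", 1)[1].strip()

def handle_message(message):
    """Return the action to perform, or the error text shown on the LCD."""
    instruction = get_instruction(message)
    if instruction in KNOWN_COMMANDS:
        return instruction
    return "I don't understand"
```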

What happens if the duck understands the instruction? It can perform two actions: move its wings, and move its head and beak. What we really do is write new angle values to the servo motors. The motors have a rest angle (set in the setup() method), and when a command is activated, this value is modified: a small arm on the motor rotates to the new position. This new position is held for the necessary time, and then the original angle is restored. Usually there is more than one repetition. For example, the duck moves its wings three times; that means the programme on the Arduino switched between the initial and final angle values three times.
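As a sketch, the sequence of angles written to a servo for three wing flaps could be generated like this (the angle values below are placeholders of our own; the real rest and active angles are set in setup()):

```python
def flap_angles(rest_angle, active_angle, repetitions):
    """Angles written to the servo: toggle between the rest position and
    the active position `repetitions` times, ending back at rest."""
    angles = []
    for _ in range(repetitions):
        angles.append(active_angle)
        angles.append(rest_angle)
    return angles
```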

When the motors move, their arms move the wings and the head. This process is explained below, in the second part of the post.

Before and after

Fig.13 Before and after

We recorded two videos: the first is a test of the duck and the second is the final result of our creature.

There are three small projects that helped us with ours. Here are their links in case you are interested:

  1. Tweet-a-Pot: Twitter Enabled Coffee Pot
  2. simpleTweet_01 python
  3. Send a Tweet to Your Office Door: And let your coworkers know what you’re up to


As part of the explanation of the aforementioned robot H9832B, we must admit that cleaning out the inside of the duck (removing all the cables, the motor, the speaker and the batteries) was a lot of fun. Destroying and stripping our doll gave a different value to the project; in other words, the idea of building a robot by deconstructing a small doll was fun. We had a lot of action along with the creation of the project.

To clarify what our H9832B project is, we relied on a scheme drawn by our Professor William Turkel on 10 February 2012. http://williamjturkel.net/


Fig.14 Scheme

After Antonio and I talked about creating a robot that could be controlled by an Arduino platform, we went to talk to Professor William Turkel, who ended up drawing a scheme of what our design could be.

The idea was to control the movements of the robot, in this case our doll, called H9832B duck (Figure 1, right side of the image), by using a laptop that receives commands through the Internet, that is, through Twitter (Figure 1, left side of the image).

We wanted to control the robot with several motors (at the same time), in our case 9 g servos, and to do that we started using a Micro Maestro #1350 with a battery. The six motors were controlled through their connection to the Micro Maestro, an external power supply, and a programming application called the Pololu Maestro Control Center.

Testing Servo motors

Fig.15 Testing Servo motors

After much time spent understanding how to manage and program the servos, we realized that we really did not need the Micro Maestro, because we could manage our H9832B duck using only two servos. What had led us to use the Micro Maestro was the thought that moving several motors would need more power than the Arduino could supply. In truth, it was a good experience in the sense that we learned how to move six servos at the same time with different programming systems. Now that we have more knowledge, we can apply it to future projects.

Here you can download the Pololu Maestro Control Center: http://www.pololu.com/docs/0J40/all

All the decisions we have made so far came out of the ongoing work that my colleague Antonio and I have been investing throughout this course. That is, while we were working on the programming of the various systems, we were also working on the construction of the robot.

As we said before, we decided to use only two servos, which were able to move all parts of the robot. Our first servo moves both wings, and the second servo is responsible for the movement of the head and beak at the same time. We must also say that we replaced the duck’s original eyes with two red LED lights, to give it a machine look.

Placing the motors and lights was also fun. We attached the motors with threads so that they fit well inside the duck’s body and could rotate freely. Before their final placement we did many tests to make sure they could move the parts we wanted. Each motor has a different movement; that is, we programmed the angles for safe movements that wouldn’t break anything. This work was not easy, particularly the movement of the head, for several reasons. We had to put two LEDs inside the head’s shell, one for each eye, which meant looking for the right space for them so that the moving parts would not affect the cables.

The duck's guts

Fig.16 The duck's guts

The problem was that we did not have much space inside the head’s shell, as it was very small and also had a structural system that allowed it to open and close its beak. After many attempts, we were able to place the two LEDs inside the head, with wires to supply them with power. These two wires exit the head while still allowing its rotary motion.

The duck's brains

Fig.17 The duck's brains

After making sure the lights worked and the rotation mechanism moved, we jumped to the next step: tying the head to a servo with a thread. We could not tie the thread directly to the servo, because then the servo would not be able to move the head. We had to tie the thread to the head, pass it through an existing fixed part, and then attach it to the motor; in other words, we created a pulley system. At this point, after seeing our system working, we started laughing because we were so happy we had succeeded.


Fig.18 Pulleys

By this point we had attached one wing to the other servo with a thread. This part was easier because we had more space inside the duck, so the servo movement was not difficult. Tying the other wing was the last thing we wanted to do, since that meant closing up the entire body. After quite a bit of hard work and many programming errors, we finally managed to synchronize the different movements of the head, beak and wings. All this was done by connecting the various wires coming out of the duck to the Electronic Brick Chassis and the Arduino UNO, while the Arduino UNO was connected to our laptop. The language we use to talk from the laptop to the Arduino UNO is called Wiring. Here you can download the software: http://arduino.cc/hu/Main/Software

Finally, everything was fine and worked perfectly. We were happy, so we went ahead, tied the other wing and closed up the entire body. Then we started to check everything. Before finishing, we have to tell you about another important element of our project.

All we are doing with the programming is controlling the movements of our robot through a tweet sent to the account we created at the beginning of the project, @H9832B. The commands are:

  • @H9832B $wings$: upon receiving this command on our Twitter account, the duck flaps its wings three times, as programmed.

  • @H9832B $head$: upon receiving this command, the duck moves its head and opens and closes its beak twice, as programmed.

  • @H9832B $everything$: upon receiving this command, the duck first moves its head and opens and closes its beak twice, and then moves its wings three times, again as programmed.

Regarding the lights, we decided they should always be on, but in a blinking mode.

But how can we know who has sent a tweet to @H9832B?

Since we don’t want to keep our Twitter account open all the time to see who sent the orders mentioned above, we decided to use an LCD screen, which we programmed. Thus, whenever our H9832B duck gets an order, the LCD shows which user sent the order and what type of order it was.

The duck posing for this blog

Fig.19 The duck posing for this blog

The LCD screen was always going to be part of our project, even before we switched to this one, as mentioned in the previous post.

http://mafana.wordpress.com/2012/03/10/what-is-the next/?preview=true&preview_id=188&preview_nonce=b0abd6484d.

What is interesting here is that we managed to program the LCD screen to display long messages by making them scroll.

Now, back to what we were saying when we started to check everything, that is, the movements of our H9832B duck along with the LCD screen. So… here comes the moment of truth… and the result was disappointing. We felt down and there was an uncomfortable moment of silence; basically, it did not work. Soon I began to think that for every problem there is a solution, and the truth is that having an IT man such as Antonio as my colleague made the problem less severe. He always has a solution to every problem, so it was good having him as my co-worker.

The problem that arose was that the screen did not work well: it was blinking too fast and showing strange symbols, and we did not understand what was happening. We thought that with everything plugged in at the same time there was not enough power for all of it. We quickly contacted Professor William Turkel to ask for another Arduino UNO, thinking that perhaps we needed one Arduino for the H9832B duck and another one for the LCD screen. It was a very stressful time of reflection.

What could we do now? It was Friday, there were only a few days left before the final delivery, and we could not afford to lose time.

But then, of course, Antonio saved our project again. The problem was that our LCD screen was plugged into BUS2 on the Electronic Brick Chassis. BUS2 shares pins with D9, D10, D11 and D12, which is where the robot’s wires were plugged in. So we moved the LCD screen from BUS2 to BUS1, which does not involve the pins used by the robot’s cables. It was a very intense moment, seconds seemed like minutes, and at this very moment there was another surprise waiting for us, but a positive one.

Finally, our H9832B duck was moving according to the commands and the LCD screen was showing the messages perfectly. So yes, there was joy again!

Posted in Uncategorized | 2 Comments

Our project: final idea

When we presented our idea to Bill, we weren’t aware of its difficulty (actually, its easiness). We still think our original idea was a good one but, to be honest, it doesn’t display fireworks. Bill tactfully suggested moving “something”, a monkey or the like. Movement is always more spectacular, and we gladly accepted.

Another change was to keep the provisional computer in the project. Mobile plans can be expensive, so Bill had the idea of controlling the “thing” through the Internet, specifically Twitter. This opened up our options, since we could use any device connected to the Internet: a computer, a mobile, a tablet and so on.


Fig.01 Duck

The last question was: “what can we move?” I live near the River Thames. One day, on my way home, I came across a little duck. I caught it, and we’re upgrading it now.

So the final idea was to send commands to the duck from Twitter, commands such as “flutter”, “move your head”, “speak”… As we had already done a lot of work on the LCD, we thought it would be useful to display some extra information. The flow of data is slightly different now:


Fig.02 Architecture

Further technical details will be explained in the next posts.

Posted in Uncategorized | 1 Comment

Our project: original idea

Mohammed and I were thinking of an idea for our final project. We had to apply our newly acquired Arduino knowledge to something. Because of our recent success with the liquid crystal screen, we thought we might combine the Arduino and the LCD to show some useful information on it. Yes, we failed the first time with the LCD, but succeeded a week later. We did the “Hello World!” exercise and played with scrolling.

Then, Mohammed had a great idea! “Why don’t we use an LCD attached to a door to show messages?” We immediately thought of a teacher’s office. The teacher could send messages like “I’m late” or “I’m not coming to work today”.


Fig.01 LCD

Our original idea was to connect a fixed mobile to the Arduino and send messages from another mobile. But first, we decided to replace the fixed mobile with a computer in order to check that things worked. So we divided our project into two main parts: the side from the mobile (from which the message is sent) to the computer, and the side from the Arduino to the screen. On the one hand, we had to be able to scroll long messages on the LCD. On the other hand, we had to manage to send messages from the mobile to the computer. The complete cycle would therefore be: mobile, computer, Arduino, LCD.


Fig.02 Architecture

However, I won’t go deeper into this idea, because it won’t be the final project.

Posted in Uncategorized | Leave a comment

Arduino and other things

I’ve learnt a lot about Arduino since my last post. Last week, Mohammed and I used Inkscape. It was fun to create drawings and shapes. We didn’t take any pictures, but what we did isn’t far from this:


Fig.01 Inkscape

We played with lines, colors and transparencies. I know the difficulty of programming this kind of software. When I studied Computer Science in Spain, I took a course called Graphical Representation Techniques by Computer. Our project consisted of programming a 2D CAD tool. We had to simulate a screen with thick pixels (they were small squares of real pixels) and the menu. Although it was 9 years ago, I remember perfectly the case of the digitization of the straight line: start point, end point, formula of the straight line and… light the appropriate pixels. I wish I had my programme with me. It was something like this (indeed, it was much better!) (by the way, I created this image with Inkscape):

Digitalization of straight line

Fig.02 Digitalization of straight line

We are aware of the importance of knowing these tools. They could be really useful in a presentation (we call it an exhibition!).

Tomorrow is a promising class, because we’re going to see 3D representations. I’m lucky to be able to count on probably the student who knows the most about 3D models on campus. We’ll see.

Two weeks ago, we learnt a little about Processing. This programming language is very similar to Wiring (the one that runs on the Arduino), but it is executed on the computer. We definitely need a language like this for our project. Instead of Processing, we chose Python because it is more familiar to me and fits our purpose. But our project deserves another post.

Posted in Uncategorized | Leave a comment

Arduino 2.0

Today, we had our second class of Interactive Exhibit Design and we played with the Arduino again. Last Wednesday, we programmed the Arduino by transferring code from the computer to the board, and we got fun results. But today we introduced an important difference: we transmitted data from the Arduino to the computer! That means the flow of information between the computer and the Arduino is bidirectional. This opens up a world of possibilities.

In this class, we did three exercises. The first was Digital Read Serial. In this exercise, we used a button as input and the computer as output. The Arduino constantly read the state of the button and printed the corresponding value (0=OFF, 1=ON) on the computer’s screen.

Digital Read Serial

Fig.01 Digital Read Serial

The second was Analog Read Serial. The idea is similar to the digital version but, in this case, we had a knob as input instead of a button. As we turned the knob, the values on screen varied from 0 to 1023. The reason is that we don’t have two discrete values (ON and OFF), but a continuous range.
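As a quick worked example, a 10-bit reading maps back to a voltage like this (assuming the usual 5 V reference of a standard Arduino board; this conversion is general knowledge, not something from our class notes):

```python
def adc_to_volts(reading, vref=5.0):
    """Convert a 10-bit analogRead() value (0-1023) to volts,
    assuming a 5 V reference voltage."""
    return reading * vref / 1023
```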

Analog Read Serial

Fig.02 Analog Read Serial

For the third exercise, we used an electronic brick connected to the Arduino board. This brick worked as an interface for the Arduino, so it was easy to perform the first example of course 0 of Seeed Studio Works’ Electronic Bricks Cookbook, volume 1. The result was that the LED lit up whenever we pressed the button.

Button (Electronic Brick version)

Fig.03 Button (Electronic Brick version)

Unfortunately, we failed with the liquid crystal screen when we tried a more complex exercise, and we didn’t have enough time to fix it. We’ll see next week.

Mohammed and I decided to form a group and work together in this course. Although we still need to learn a lot, we believe we are ready to think of an interesting project. We’ll do that in the coming days, and we’d like to be working on it by next class.

Posted in Uncategorized | 1 Comment

Arduino 1.0

I had my first religious experience with Arduino last week, and I shared it with my friend Mohammed and a nice girl called Cynthia. Arduino is more than a simple board. It’s “an open-source electronics prototyping platform based on flexible, easy-to-use hardware and software.” [1]

This hardware contains a microcontroller that can perform tasks if you programme it properly, just like a minicomputer. Its inputs and outputs let the user interact with it.


Fig.01 Arduino

Gone are the days when you had to use a signal generator to supply information and observe the results on an oscilloscope’s radioactive-green screen. [2] And yes, I had to learn how one worked for my lab practicals in the Physics course when I studied Computer Science.


Fig.02 Oscilloscope

Due to my background, I found Arduino fantastic. I couldn’t believe how easy it was to programme it and check the results! The only thing you need is a computer with the Arduino IDE installed, where you write your program (or copy and paste it); connect it to the Arduino board and play!

Arduino IDE snapshot

Fig.03 Arduino IDE snapshot

We did three exercises in class. The first was Blink. We played with delays and watched the LED turn on and off. The second was Button. Like children, we really enjoyed pressing the button again and again to turn the LED on and off! The third was Button State Change. We could easily identify the modulo function and change the number of button pushes needed to turn the LED on and off.
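The modulo trick from Button State Change fits in a couple of lines (Python here for brevity; the stock Arduino example lights the LED every fourth push, and the `every` parameter name is our own):

```python
def led_on(push_count, every=4):
    """True when the LED should be lit: the number of button pushes
    is a multiple of `every` (the modulo function mentioned above)."""
    return push_count % every == 0
```

Changing `every` is exactly the tweak we made in class to alter how many pushes it takes to toggle the LED.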

I wish I had had my camera with me. I would have taken some pics of our progress. I won’t forget it next class! I borrowed this pic from the Internet.


Fig.04 Led

[1] http://arduino.cc/

Posted in Uncategorized | 1 Comment

The magic bag

Every morning I have to prepare my bag for a long day: classes, job, gym… That means I need a big bag to hold all kinds of things. I love my sports bag; I bought it after arriving in London.

Fig.01 Sports bag

But my cool bag became heavier and heavier as I packed in my stuff: my laptop,

Fig.02 Laptop

my English-Spanish dictionary,

Fig.03 Dictionary

my sport clothes,

Fig.04 Sport clothes

my meal,

Fig.05 Meal

my class books…

Fig.06 Class books

Yeah, it’s hard to carry so many things at once! I wish my bag were like Mary Poppins’. I couldn’t explain how it would work, but everybody would want to have one.

Going a step further, Doraemon’s pocket would be even more useful. Doraemon is able to take any magic artifact out of his magic pocket.

From the point of view of a historian, you could use it as a kind of vortex or time machine to fetch objects from ancient times and solve old mysteries now, like the Antikythera mechanism or the Baghdad battery.

Fig.07 Antikythera mechanism

Fig.08 Baghdad battery




Posted in Uncategorized | Leave a comment

A covered gift for Christmas

Wait a minute! What is Eaton’s? Who is Eaton? For those of us who are not Canadian or don’t know much about the history of Canada, those are the first questions to answer. At least, this was my case.

Thanks to Wikipedia, I now know that Timothy Eaton founded Eaton’s in Toronto in 1869, that Eaton’s was once Canada’s largest department store retailer, and that its catalogue was found in the homes of most Canadians.

Once I had laid the groundwork, I chose my books from the Eaton’s Fall and Winter catalogue of 1913-14. I wish I could have got a picture of the cover, but I did not find it on Archive.org, Google Books, Hathi Trust or Project Gutenberg. Or Google Images. There are many other Eaton’s catalogues, but not the Fall and Winter one from 1913-14, number 108.

My books were selected from the Books for young people section.

I tried to search for old but good-quality editions. Here they are:

1) The illustrated natural history, John George Wood

Fig.01 Wood's illustrated natural history

Overview | Full-text

Google Books
Overview | Full-text

Hathi Trust
Overview | Full-text



2) Robinson Crusoe, Daniel Defoe

Fig.02 Robinson Crusoe

Overview | Full-text

Google Books
Overview | Full-text

Gutenberg Project
Overview | Full-text



3) Through the looking glass and what Alice found there, Lewis Carroll

Fig.03 Through the looking glass and what Alice found there

Overview | Full-text

Google Books
Overview | Full-text

Hathi Trust
Overview | Full-text

Gutenberg Project
Overview | Full-text


4) Fairy Tales, Hans Christian Andersen

Fig.04 Andersen's Fairy tales

Overview | Full-text

Google Books
Overview | Full-text

Hathi Trust
Overview | Full-text

Gutenberg Project
Overview | Full-text


5) Alice’s adventures in Wonderland, Lewis Carroll

Fig.05 Alice's adventures in Wonderland

Overview | Full-text

Google Books
Overview | Full-text

Hathi Trust
Overview | Full-text

Gutenberg Project
Overview | Full-text


6) Black beauty, Anna Sewell

Fig.06 Black beauty

Overview | Full-text

Google Books
Overview | Full-text

Hathi Trust
Overview | Full-text

Gutenberg Project
Overview | Full-text

7) A child’s garden of verses, Robert Louis Stevenson

Fig.07 A child's garden of verses

Overview | Full-text

Google Books
Overview | Full-text

Hathi Trust
Overview | Full-text

Gutenberg Project
Overview | Full-text


8) Gulliver’s Travels, Jonathan Swift

Fig.08 Gulliver's travels

Overview | Full-text

Google Books
Overview | Full-text

Hathi Trust
Overview | Full-text

Gutenberg Project
Overview | Full-text

9) Mother Goose’s rhymes jingles and fairy tales

Fig.09 Mother Goose's rhymes jingles and fairy tales

University of Florida, Digital Collections
Overview | Full-text







10) A child’s story of the Bible, Mary Artemisia Lathbury

Fig.10 A child's story of the Bible

Overview | Full-text

Gutenberg Project
Overview | Full-text





11) A child’s life of Christ, Mary Artemisia Lathbury

Fig.11 A child's life of Christ

Overview | Full-text

Hathi Trust
Overview | Full-text




12) Animal stories for little people, James Hartwell

Fig.12 Animal stories for little people








During this search, I drew some conclusions:

It’s amazing that so many old books are digitized and free. On the other hand, copyright is strange: some editions offer free full text, while other editions of the same book are still under copyright.

There are some interesting features on these websites. For example, both Archive.org and Hathi Trust allow sorting by date. Hathi Trust also lets you restrict a search to full-view books.

URLs are another curious point. Project Gutenberg has very simple URLs, whereas Google Books has extremely long and complex ones (even its permalinks).

These four sites are essential when looking for digitized content. If a search doesn’t succeed there, it will rarely succeed anywhere else. There were two books I could not find on them: Mother Goose’s rhymes jingles and fairy tales (which I found in the Digital Collections of the University of Florida) and Animal stories for little people (on sale on Amazon and eBay).

It was impossible to find out the author of Mother Goose’s rhymes jingles and fairy tales.

Some overviews include a QR code for the book.

I first heard of these projects in Digital History class, but this is the first time I have used them. I’m delighted. They are carrying out hard work, but it is worth it. I will certainly browse them for future research!

Posted in Uncategorized

Getting Started with Alaska: A Guide to Online Resources

Fig.01 Alaska

[This post was written by me and corrected by my workmate and friend Camelia.]


Alaska is my favourite singer. Although she was born in Mexico City, she has lived in Spain since she was 10. I grew up in my native Spain under her influence, which has forever defined my musical tastes. I was also fortunate enough to attend six of her concerts. I have chosen Alaska for this project because I admire her and because I feel we have a few dates in common:

  • Alaska sings “Jason y tú”, a cover of Alice Cooper’s “He’s back”, a song about the main character of Friday the 13th. This film is special to me because it premiered on the 13th of June 1980, exactly the same day I was born.
  • Alaska was also born on the 13th of June (although some years before).
  • Alaska got married on the 27th of May 2011, exactly the same day I did. (I came to know this a few weeks later.)


Fig.02 Alaska

Olvido Gara Jova, better known as Alaska, is a Spanish singer who was born in Mexico City in 1963. When she was 10, she moved to Spain, and at 14 she began her musical career. She became a symbol of the Movida madrileña, and she now combines her artistic work with her studies in History. Her first band was called Kaka de Luxe (1977), in which she didn’t sing but played the guitar. This punk band was made up of Alaska, Carlos Berlanga (son of the Spanish film director Luis García Berlanga) and Nacho Canut, among others, and was the seed of the Movida madrileña. One year later Kaka de Luxe broke up, and Alaska, Carlos, Nacho and others formed Alaska y Los Pegamoides (1978), who came to be considered one of the most important bands in Spain. At the end of 1982 they broke up again, and in 1983 Alaska, Carlos and Nacho (just the three of them) formed Alaska y Dinarama, even more important than the previous band, which lasted until 1989. From 1990 to the present, Alaska has sung in Fangoria, the band she formed with Nacho.

Fig.03 Punk

Here you can find out interesting things about Alaska’s life:


Kaka de Luxe (1977-78)

Fig.04 Kaka de Luxe

Kaka de Luxe (1978)

Las canciones malditas (1983)

“Rosario”, Kaka de Luxe, Kaka de Luxe:

Alaska y Los Pegamoides (1978-1983)

Fig.05 Alaska y Los Pegamoides

Grandes éxitos (1982)

Alaska y Los Pegamoides (1983)

“Bailando”, Alaska y Los Pegamoides, Grandes éxitos:

Alaska y Dinarama (1983-1989)

Fig.06 Alaska y Dinarama

Canciones profanas (1983)

Deseo carnal (1984)

No es pecado (1986)

Diez (1988)

Fan fatal (1989)

“A quién le importa”, Alaska y Dinarama, No es pecado:

Fangoria (1990-today)

Fig.07 Fangoria

Salto mortal (1990)

Un día cualquiera en Vulcano (2003)

Interferencias (1998)

Una temporada en el infierno (1999)

Naturaleza muerta (2001)


Arquitectura efímera (2004)

El extraño viaje (2006)

Absolutamente (2009)

El paso trascendental del vodevil a la astracanada (2010)

“Electricistas”, Fangoria, Una temporada en el infierno:

As a soloist

Fig.08 Alaska como solista

La bola de cristal (1985)

“Abracadabra”, Alaska, La bola de cristal:


Alaska is the author of “Transgresoras”, a book about the most important women in History whom she admires, from Cleopatra and Eleanor of Aquitaine to Dian Fossey.
Alaska herself is the subject of two biographies: “Alaska”, written by Mario Vaquerizo (her husband), and “Alaska y otras historias de la movida”, by Rafa Cervera.

Transgresoras, Olvido Gara Jova (Alaska)


Alaska, Mario Vaquerizo


Alaska y otras historias de la movida, Rafa Cervera




Fig.09 Alaska and Pedro Almodóvar (old)

Fig.10 Alaska and Pedro Almodóvar (new)

Alaska is a multifaceted artist. Besides singing, she is a radio host and appears on TV shows (the most recent was “Alaska y Mario” on MTV Spain). She has also acted in several movies. Her most important role was “Bom” in “Pepi, Luci, Bom y otras chicas del montón”, Pedro Almodóvar’s first film. (This scene may offend some sensibilities. It is subtitled in English; the original sound is out of sync with the picture:)



Pepi, Luci, Bom y otras chicas del montón, Pedro Almodóvar


Other information

Fig.11 Alaska - Gioconda

Alaska borrowed her name from Lou Reed’s “Caroline Says II”. She was influenced by David Bowie (another idol of mine) and glam rock. She has actively fought for homosexual rights. Alaska considers herself an advocate for animals and, more specifically, anti-bullfighting. She supports drug legalization and has openly admitted to being anti-religion. She is interested in medieval history and is currently studying for an undergraduate degree in History.

Fig.12 Alaska - Virgin

Fig.13 Bartlaska and The Simpsonoides











All these personal web pages and blogs contain very interesting information. However, they often tend to be subjective. The most unbiased information can be found on Wikipedia. Crowdsourcing avoids personal opinions.

Fig.14 Alaska anti-bullfighting

From now on, this post will also be an important reference on Alaska (in English!).


Fig.15 Alaska's looks

Hey girls, if you like Alaska’s style, there is a website that gives you some steps to look like her:


Alaska’s Facebook page:

Fangoria’s web page:

Alaska and Mario (her husband)’s blog:

Posted in Uncategorized

How would a historian tag my blog?

Tagging blog posts is a good practice among bloggers. It is an easy way to classify content, which helps readers search for information. I created this blog for Digital History class, but I have neither categorized nor tagged it. This class is led by a historian, mainly for historians. How could I be sure of choosing the proper historian tags? How would a historian tag my blog? In this post, I present my experiment.

The idea is to study how historians tag their own blogs and apply their tags to my blog. For this, it is necessary to analyze the contents of the historians’ posts and of my own posts, and deduce which of their tags are relevant to my blog. Let’s go!

The first step was to look for appropriate blogs. I thought I should use at least two tagged blogs. My first option was Bill Turkel’s blog, williamjturkel.net/updates/, but Bill does not usually tag it, so I had to search Google for “digital history blog”. I found one called Digital History Hacks (2005-08) and, yeah, it is another (older) blog of Bill’s (coincidence?). I decided that the second blog should belong to one of my classmates. I explored all their blogs and the only one full of tags was Dave’s, so Backwards with Time was my second choice.

Once I had my two blogs, the second step was to select their posts. 20 posts seemed enough: 15 from Bill’s and 5 from Dave’s. For each post, I extracted the 10 most frequent words (I call this set the keywords) and associated them with the tags supplied by the bloggers. I used Wordle for this. Wordle is an online tool that reads a text and shows its word cloud.

Fig.1 Word cloud

Wordle also offers a tool to count words and sort them by frequency. I had to discard some common words and select the important ones. For the example in Fig.1, one of Bill’s posts, I set up the following keywords: {physical, past, machine, fabrication, spaces, historians, history, digital, humanist, computer}. The tags supplied by Bill were: {bricolage, DIY, fabrication, hacking, physical computing}. Here is the complete set of posts along with the extracted keywords and the bloggers’ tags. And here are my posts (remember that my blog has no tags; that is the reason for this experiment):
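Wordle did the counting for me, but the count-and-discard step itself is simple. Here is a hypothetical Python sketch of it (the stopword list and the sample text are illustrative, not the ones I actually used):

```python
import re
from collections import Counter

# Common words to discard before picking keywords (illustrative list).
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is",
             "it", "that", "this", "for", "on", "with", "as", "are"}

def keywords(text, n=10):
    """Return the n most frequent non-stopword words in a text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

post = "digital history and the digital past: historians study history"
print(keywords(post, 3))  # ['digital', 'history', 'past']
```

For a real post one would still inspect the result by hand, as I did, since frequency alone does not guarantee a word is a good keyword.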

Keywords: web, information, knowledge, machine, data, semantic, historians, abundance, technologies, computer.

Keywords: digital, history, internet, technologies, field, future, fashion, research, world, past.

Keywords: life, real, world, virtual, Internet, users, cyberspace, facebook, examples, friends.

This entry corresponds to this post, which did not exist when I did the experiment. However, I knew what it would be about, so I added some keywords according to my own criteria.
Keywords: blog, data, historians, online, programming, tag.

The third step was to apply an Artificial Intelligence technique to deduce the tags for my posts based on their keywords and on the keywords and tags of the posts above. In Machine Learning, this collection of 20 posts is called the training set. The technique I chose was the ID3 algorithm, which can deduce the value of one attribute from the others. That is, ID3 works with a set of examples (the training set); each example has attributes, one of which is the target (the one we want to learn). In the training set, all the information is provided. For example, Fig.2 shows “real facts” about which days a team played ball.

Fig.2 Example of training set

What will happen on day D15 and the following days? ID3 can deduce whether the team will play or not based on the outlook, temperature, humidity and wind of that day. To do so, ID3 builds a decision tree* (Fig.3):

Fig.3 Decision tree

Or expressed in rule format*:

  • IF outlook = “sunny” AND humidity = “high” THEN play ball = “no”
  • IF outlook = “sunny” AND humidity = “normal” THEN play ball = “yes”
  • IF outlook = “overcast” THEN play ball = “yes”
  • IF outlook = “rain” AND wind = “strong” THEN play ball = “no”
  • IF outlook = “rain” AND wind = “weak” THEN play ball = “yes”

* In order to build the smallest tree (the simplest rules), it is necessary to place the “best” attributes at the top of the tree, that is, the attributes that yield the least entropy (entropy gives an idea of homogeneity).

The pseudocode of ID3 can be found here.
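To make the entropy idea concrete, here is a small Python sketch that picks the best root attribute for the play-ball example by information gain. It uses the classic play-tennis table that Fig.2 shows; the function names are my own, not part of any ID3 implementation:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain(rows, attribute, target="play"):
    """Entropy reduction obtained by splitting the rows on one attribute."""
    base = entropy([r[target] for r in rows])
    remainder = 0.0
    for value in {r[attribute] for r in rows}:
        subset = [r[target] for r in rows if r[attribute] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

# The classic play-tennis data of Fig.2: 9 "yes" days, 5 "no" days.
DATA = [
    ("sunny","hot","high","weak","no"), ("sunny","hot","high","strong","no"),
    ("overcast","hot","high","weak","yes"), ("rain","mild","high","weak","yes"),
    ("rain","cool","normal","weak","yes"), ("rain","cool","normal","strong","no"),
    ("overcast","cool","normal","strong","yes"), ("sunny","mild","high","weak","no"),
    ("sunny","cool","normal","weak","yes"), ("rain","mild","normal","weak","yes"),
    ("sunny","mild","normal","strong","yes"), ("overcast","mild","high","strong","yes"),
    ("overcast","hot","normal","weak","yes"), ("rain","mild","high","strong","no"),
]
ATTRS = ["outlook", "temperature", "humidity", "wind"]
rows = [dict(zip(ATTRS + ["play"], r)) for r in DATA]

best = max(ATTRS, key=lambda a: information_gain(rows, a))
print(best)  # outlook
```

“Outlook” has the highest gain, which is why it sits at the root of the decision tree in Fig.3; ID3 then recurses on each branch.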

My goal was to apply ID3 so that, given the training set of posts, the computer would learn the rules for when a tag appears in a post and when it does not (that is, its classification). In this experiment, an example is a post, an attribute is a keyword and the target is a tag.

First of all, I had to set up the total set of keywords: the intersection of my own keywords and the historians’ ones (19 in total): {blog, computation, data, digital, examples, historians, history, information, knowledge, past, reality, research, users, virtual, web, world, online, programming, tag}. It should be said that some keywords with similar meanings were merged into one (for instance, computer and computation). Secondly, I had to choose which tags from the training set were relevant to my blog (that is, the tags most likely applicable to my posts). I selected these 30: {browser, computational history, data mining, digital, digital history, diy, entropy, google, history, html, interdisciplinarity, machine learning, markup language, ocr, online research, physical computing, programming, reality, representation, search, search engines, teaching, technology, text mining, thing knowledge, turing test, virtual reality, web resources, wikipedia, wikis}.
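To fix ideas, each post can be encoded as a row of yes/no values over these 19 keywords, plus one target column for the tag being learned. A hypothetical Python sketch of that encoding (the helper name is mine, not part of the ID3 implementation I used):

```python
# The 19 keywords from the intersection above.
KEYWORDS = ["blog", "computation", "data", "digital", "examples",
            "historians", "history", "information", "knowledge", "past",
            "reality", "research", "users", "virtual", "web", "world",
            "online", "programming", "tag"]

def encode_post(post_keywords, has_tag):
    """Encode a post as one training row for a single tag: a yes/no
    value per keyword plus a yes/no target column."""
    row = {kw: ("yes" if kw in post_keywords else "no") for kw in KEYWORDS}
    row["target"] = "yes" if has_tag else "no"
    return row

# A post containing "web" and "knowledge" that is known to carry the tag:
row = encode_post({"web", "knowledge"}, has_tag=True)
print(row["web"], row["past"], row["target"])  # yes no yes
```

One such row per post, repeated once per tag, is exactly the 20-post table the experiment feeds to ID3.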

At this point, I had 20 posts, 19 keywords and 30 tags. For each post, what is the correspondence between its keywords and its tags?

Fig.4 Training set

The training set is displayed in Fig.4. For each post, a “yes” means that the post contains the keyword and a “no” the opposite. The fourth step was to program the ID3 algorithm and represent this information properly. I had to add a last column with the tag I wanted to learn and provide its values for each historian’s post. I repeated the process 30 times (once for each tag)! The programming language I know best is Java, and I found an implementation of ID3 in that language, so the only thing I needed was to create a project in Eclipse, provide the training set in a proper format and run the application. As an example, here is the rule for learning when a post has to be tagged/classified as “history”:

  • IF knowledge = “no” AND world = “no” THEN history = “no”
  • IF knowledge = “no” AND world = “yes” AND digital = “yes” THEN history = “no”
  • IF knowledge = “no” AND world = “yes” AND digital = “no” THEN history = “yes”
  • IF knowledge = “yes” THEN history = “yes”

That means: “if the keyword knowledge belongs to the post, then the post is tagged with history; otherwise, if knowledge doesn’t belong but world does and digital doesn’t, then the post is tagged with history too”. This rule expresses the idea that (traditional) history is linked to knowledge and world, but not to digital. On the other hand, the rule that learns digital history establishes that this tag is related in some way to online, history, digital, knowledge, historians and web (that rule is too complex to be shown here).

The fifth step was, for every rule learned, to check whether my posts satisfied it. According to the example above, the post titled A new scarcity gets the tag history. It is, in fact, the only post where I mention History in the past and compare it with current ways of doing History. I did this step manually, i.e., I analyzed every rule and checked whether the tag was suitable for each of my posts.
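That manual check is mechanical enough to sketch in code. Here is the learned “history” rule written as a plain Python function (a sketch of my by-hand process, not the Java program I ran):

```python
def tagged_history(kw):
    """The learned 'history' rule: fires when 'knowledge' is present,
    or when 'world' is present without 'digital'."""
    if "knowledge" in kw:
        return True
    if "world" in kw and "digital" not in kw:
        return True
    return False

# Keywords of "A new scarcity" (from the list above): 'knowledge' is
# among them, so the rule tags the post with "history".
scarcity = {"web", "information", "knowledge", "machine", "data",
            "semantic", "historians", "abundance", "technologies",
            "computer"}
print(tagged_history(scarcity))  # True
```

Running each post’s keyword set through the 30 learned rules would reproduce the tag assignments listed next.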


The result was a set of learned tags* for my posts. My blog was originally untagged, but now it has its own tags!

Keywords: web, information, knowledge, machine, data, semantic, historians, abundance, technologies, computer.
Tags: browser, data mining, digital history, google, history, html, interdisciplinarity, machine learning, markup language, technology, text mining, virtual reality, web resources.

Keywords: digital, history, internet, technologies, field, future, fashion, research, world, past.
Tags: digital history, google, online research, search engines, teaching, technology, virtual reality, web resources, wikipedia.

Keywords: life, real, world, virtual, Internet, users, cyberspace, facebook, examples, friends.
Tags: digital, digital history, history, online research, reality, virtual reality.

This entry corresponds to this post, which did not exist when I did the experiment. However, I knew what it would be about, so I added some keywords according to my own criteria.
Keywords: blog, data, historians, online, programming, tag.
Tags: data mining, digital history, entropy, machine learning, ocr, programming, text mining, turing test, wikis.

*green = suitable tag
*red = unsuitable tag


This experiment has shown some interesting results to me:

  1. About 75% of the tags are consistent with the post content.
  2. Every post was tagged as “digital history” (good… after all, this is a blog for Digital History class).
  3. A new scarcity was tagged with such relevant tags as “google”, “markup language” (remember that this post mentions XML) and “web resources”.
  4. Neither field nor fad nor fashion was classified as “online research”, “search engines” or “web resources”.
  5. “What is real?” was tagged with coherent tags such as “reality” or “virtual reality” (precisely the topics of that post).
  6. This post was tagged with “data mining”, “entropy” (note that entropy was not a keyword of this post and it was still correctly classified), “machine learning”, “programming”, “text mining” and “turing test” (a test of a machine’s ability to exhibit intelligent behaviour).

These results may not be perfect, and the methodology is far from the scientific method. However, it is a good first approach, and it proves once more the power of Artificial Intelligence, in this case of Machine Learning and the ID3 algorithm.

Posted in Uncategorized