Deep Learning: Self-Driving Car (hobby)

Really, it is a Tesla… four electric motors, batteries, sensors, and two cameras.

Okay…kinda like a Tesla.

I have been fascinated with neural networks going back to the early ’90s, when I was doing work on forms recognition and handwriting analysis. The idea lost appeal for a long time but has had a resurgence as “deep learning” is being used for processing large amounts of data. Self-driving cars are one area where they are making gains. Recently I watched the series by Dr. Lex Fridman (MIT 6.S094: Introduction to Deep Learning and Self-Driving Cars). Besides covering a lot about neural networks, he talked about how Tesla instruments their cars to “learn”.

Is the car really learning to drive? Not exactly. By driving the car around, it gathers data that can be used later on. The data includes images, sound, temperature, GPS and driver reactions. All of this data is fed into a neural network built with a framework such as TensorFlow. The car knows only what it has “seen” before. The system is memorizing every possible situation. Anything that occurs out of context from what it knows could cause trouble, but the more data that is gathered, the less likely it is that something unforeseen will occur. Of course, AI is always improving and at some point will be able to make better choices about situations it has never seen – zero-shot learning.

I wanted a way to experiment with this myself. Buying a Tesla is out of the question. I could add sensors to my car, but that is just asking for distracted driving. Also, I work remotely and don’t drive a lot. The next best thing would be to create a small ‘car’ that I could use to gather data.

This car is not going to be on a road, which means it won’t have things like lane lines to guide it. I might build a track where it could be driven, or just let it wander around the house and scare the dog and cat.

The picture above shows a small RC car. It has a motor for each wheel. Steering is similar to tank driving: turns are done by slowing down one set of wheels while speeding up the other. Sharper turns can be made by reversing the wheels instead of just slowing them down. It’s not very smooth, but it gets the job done.
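The steering mix can be sketched in a few lines of Python (a toy model; the function name and the scaling are my own, not the actual Arduino code):

```python
def mix_tank_drive(throttle, steering):
    """Convert throttle and steering inputs (-1..1) into left/right wheel speeds.

    A gentle turn slows one side; steering past the throttle value drives
    that side in reverse, which gives the sharper pivot described above.
    """
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    return left, right

# Straight ahead: both sides run at the same speed.
# A hard turn at low throttle reverses one side entirely.
```
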

The car is first configured for data gathering. I am using an Arduino with Bluetooth for communications. I wrote a simple app for my Android tablet. There are six ultrasonic sensors for determining distance. I have two cameras (only one is in the picture) mounted on the front. These will record stereo images, which will help determine depth. The first thing I learned is that the ultrasonic sensors will only see a small portion of what is in front of them. On the first trial run they completely missed the table and chair legs. The sensors need to sweep the area in front so as to create a point cloud. For this I am adding a pan-and-tilt control to the sensor mount. Two servos will move the sensor array. Data is being recorded to a flash drive.


I am recording the data at one-second intervals. The car doesn’t move very fast, so this rate should be sufficient. The value at each sensor, the two camera images and the drive command are recorded.
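A minimal sketch of what one recorded sample might look like (the fields come from the description above; the structure and all names are my own assumptions, not the actual logging code):

```python
import time

def record_sample(sensors, left_image, right_image, drive_command, log):
    """Append one time-stamped training sample to the log."""
    log.append({
        "t": time.time(),
        "sensors": sensors,          # six ultrasonic distance readings
        "left_image": left_image,    # file names of the stereo image pair
        "right_image": right_image,
        "drive": drive_command,      # what the driver told the car to do
    })

log = []
record_sample([42, 40, 55, 60, 38, 41], "l_0001.jpg", "r_0001.jpg", "FWD", log)
# In the real loop this would run once per second:
#   while driving: record_sample(...); time.sleep(1)
```
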

Currently I am in practice mode, refining the app to better control the car. I found some sensors for the wheels to detect the speed of rotation. I think I’d need to upgrade the Arduino to add any more devices, so I’ll leave them off for the time being.


More later…

Posted in deep learning

Data Centric programming – End of the Cloud


Interesting. The idea of data-centric computing is something I have been thinking about. The rise of machine learning plays a big part in this…

The End of Cloud Computing

Posted in Uncategorized

Unity – pong game.. why not


The original pong game was not much compared to what would come later. For many it was amazing that something like it was available for use in the home. I have seen others make pong games in Unity and thought it might be fun to try. Pictures of the console show controls for two players. For this project I’ll make the second player the computer. Yeah, it will be hard to beat, but…

Using Unity 5.5, start out with a basic camera.



Using a graphics tool I made a paddle and a ball (PNG format). Create a folder named Sprites and drop in the two images. Create an empty game object named ‘player’.


Drag the paddle sprite onto the player game object and notice the Sprite Renderer shows up as a component of the player. Select the player object and then select Component->Physics->RigidBody from the menu. The paddle will need this in order to bounce the ball.


Create a new folder named Physics. Select Asset->Create->PhysicsMaterial and name it bounce. When applied to the paddle, it will cause the ball to bounce back.


In order for the paddle to react to the ball hitting it we need to add a collider component. A BoxCollider will do. Set the material of the collider to the bounce material.


Create a new game object called Ball and add in the ball sprite. Add a RigidBody and collider as well.


So far there is one paddle and a ball. Not much of a game yet.

There needs to be some code in order to make this work. Create a new folder named Scripts and add a new C# file named paddle.cs. Below is what the code should look like (or close to it).

The Update function is part of the core Unity MonoBehaviour class. It is called once per frame. The variable ‘gameObject’ refers to the object to which this script is attached.

Input.GetAxis(“Vertical”) returns a value between -1 and 1 depending on whether the down or up arrow key is pressed. This value times the speed is used to increment the position of the game object.

playerPosition = new Vector2(-20,Mathf.Clamp(yPosition,-13,13));

This line creates a new 2D vector with a fixed X location of -20 (where I placed the paddle on the screen) and a new value of Y based on the new yPosition. Mathf.Clamp() restricts the Y value to between -13 and 13. These values were determined by experimentation.

The last line transforms the object to a new position. Since the x value is always -20 the paddle will only move up or down.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class paddle : MonoBehaviour {
    public float speed = 3;
    public float yPosition;
    public Vector2 playerPosition;

    // Update is called once per frame
    void Update () {
        // Arrow keys move the paddle; speed scales the increment
        yPosition += Input.GetAxis("Vertical") * speed;
        playerPosition = new Vector2(-20, Mathf.Clamp(yPosition, -13, 13));
        gameObject.transform.position = playerPosition;
    }
}

Posted in Uncategorized

Multimedia Mobile application using Low Power Bluetooth (BLE)

… in a museum. You walk by a painting and suddenly your phone becomes the voice of the artist, speaking to you about the piece… Bringing art to the guest.

In 2008 I developed an application that used RFID to trigger events on a mobile device (PDA). The main purpose was to be an Electronic Docent, a museum guide: exhibit information delivered directly to the guest.

Unfortunately, RFID never became a consumer-friendly technology. Fast forward to 2016: smartphones are prevalent and low power Bluetooth (BLE) devices are becoming ever more popular. In January, two others and I began development on a new version of the application.

The PDA has been replaced by smartphones and tablets. Both iOS and Android hold major positions in this area, and both support standard Bluetooth as well as BLE.

How it works

The application running on the device is designed to look for BLE tags. When one is located, a request is made to a server to search the database for the tag id. If the id is found, information about the media is returned and the user can choose whether or not to view the media.
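That lookup flow can be sketched with a toy in-memory database (in the real app this is a REST call to the server; all of the names and data here are hypothetical):

```python
def lookup_tags(scanned_ids, database):
    """Return media info for every scanned BLE tag id found in the database."""
    found = []
    for tag_id in scanned_ids:
        media = database.get(tag_id)
        if media is not None:
            found.append({"tag": tag_id, "media": media})
    return found

# Hypothetical tag-to-media associations set up by the site's personnel.
db = {"tag-001": "starry_night.mp3", "tag-002": "sunflowers.txt"}
matches = lookup_tags(["tag-002", "tag-999"], db)
# Unknown tags are simply skipped; the user picks from the matches.
```
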


The tags and media have to be associated. This is done by the personnel managing the location; they understand both the content and how they would like it displayed to the visitor.




One of the biggest decisions was how to develop the mobile portion of the application.

  1. Native: iOS and Android
  2. Cross-platform framework: Xamarin, Qt
  3. JavaScript/HTML5 framework: Apache Cordova (formerly PhoneGap), Ionic


Until a few years ago, mobile applications were required to be developed in Java or Objective-C. Apple refused to accept applications cross-compiled or interpreted into Objective-C. The drawback is that an application had to be developed twice. Maintenance was much harder since it required twice the effort in coding and QA.

On the other side, native applications had the ability to interact with the device’s hardware: sound, touch, GPS, and accelerometer.

Cross-platform frameworks

Frameworks such as Xamarin and Qt give the developer the ability to write one application and deploy it to multiple mobile platforms.

Xamarin: Based around C# and created by the team that created Mono. Xamarin takes C# code and creates native code for iOS or Android. Microsoft now owns Xamarin and has integrated it into its Visual Studio IDE.

Qt: This has long been a popular framework for developing applications for Windows, OSX and Linux. When mobile support was added, licensing issues arose. Qt also has less of a native look and feel.

JavaScript/HTML5 framework: Tools such as Ionic use the Angular.js framework and Cordova libraries to create cross-platform applications. The key to their success has been the Cordova (PhoneGap) libraries. These provide access to the device hardware, which lets the application behave more like native code.

We chose Ionic. There were too many issues with either Xamarin or Qt, and developing two separate native applications was out of the question.

Serving up data

Once the mobile application finds a BLE tag, it needs to get the information associated with the tag. This means an application server. This was a simple choice: Java, Hibernate, MySQL and Tomcat. This combination is proven, solid and will work well on something like AWS. One advantage of MySQL is that AWS’s Aurora database is MySQL compatible and an easy replacement if very high performance is required.

Server side

Using Java and Hibernate makes the server work pretty straightforward. The code is built in layers: Entities, DAO, Service, Controller.


Each entity represents a table in the database.

  • ExhibitTag
    • This represents a single BLE tag.
  • ExhibitTagMedia
    • This represents the media associated with a tag. A tag could have more than one media component.
  • Location
    • This represents the location of the tag.
  • Organization
    • This represents the site or organization managing the tags.
package com.tundra.entity;

import java.io.Serializable;
import java.util.Date;
import java.util.Set;
import javax.persistence.Basic;
import javax.persistence.CascadeType;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;
import javax.persistence.Table;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
import com.fasterxml.jackson.annotation.JsonIgnore;

@Entity
@Table(name = "exibittag")
public class ExhibitTag implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @Basic(optional = false)
    @Column(name = "Id")
    private Integer id;

    @Basic(optional = false)
    @Column(name = "Name")
    private String name;

    @Basic(optional = false)
    @Column(name = "Tag")
    private String tag;

    @Basic(optional = false)
    @Column(name = "Description")
    private String description;

    @Basic(optional = false)
    @Column(name = "Created")
    @Temporal(TemporalType.TIMESTAMP)
    private Date created;

    @Basic(optional = false)
    @Column(name = "Updated")
    @Temporal(TemporalType.TIMESTAMP)
    private Date updated;

    @JoinColumn(name = "Location_Id", referencedColumnName = "Id")
    @ManyToOne(optional = false, fetch = FetchType.EAGER)
    private Location location;

    @JsonIgnore
    @OneToMany(cascade = CascadeType.ALL, mappedBy = "exhibitTag", fetch = FetchType.EAGER)
    private Set<ExhibitTagMedia> exhibitTagMediaSet;

    // setters and getters removed

    @Override
    public int hashCode() {
        int hash = 0;
        hash += (id != null ? id.hashCode() : 0);
        return hash;
    }

    @Override
    public boolean equals(Object object) {
        if (!(object instanceof ExhibitTag)) {
            return false;
        }
        ExhibitTag other = (ExhibitTag) object;
        if ((this.id == null && other.id != null)
                || (this.id != null && !this.id.equals(other.id))) {
            return false;
        }
        return true;
    }

    @Override
    public String toString() {
        return "Exibittag[ id=" + id + " ]";
    }
}



Spring will create the query for findByTag() automatically.

package com.tundra.dao;

import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.transaction.annotation.Transactional;
import com.tundra.entity.ExhibitTag;

@Transactional
public interface ExhibitTagDAO extends JpaRepository<ExhibitTag, Integer> {
    List<ExhibitTag> findByTag(String tag);
}


The service layer is how the controller will interface with the server.

package com.tundra.service;

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import com.tundra.dao.ExhibitTagDAO;
import com.tundra.dao.ExhibitTagMediaDAO;
import com.tundra.dao.OrganizationDAO;
import com.tundra.entity.ExhibitTag;
import com.tundra.entity.ExhibitTagMedia;
import com.tundra.entity.Organization;
import com.tundra.response.ExhibitTagSummaryResponse;

@Service
public class TundraServiceImpl implements TundraService {

    @Autowired
    private ExhibitTagDAO exhibitTagDAO;
    @Autowired
    private OrganizationDAO organizationDAO;
    @Autowired
    private ExhibitTagMediaDAO exhibitTagMediaDAO;

    @Override
    public List<Organization> findAllOrganizations() {
        return organizationDAO.findAll();
    }

    @Override
    public Organization findOrganization(int id) {
        return organizationDAO.findOne(id);
    }

    @Override
    public List<Organization> findByName(String name) {
        return organizationDAO.findByName(name);
    }

    @Override
    public List<Organization> findByNameAndCity(String name, String city) {
        return organizationDAO.findByNameAndCity(name, city);
    }

    @Override
    public ExhibitTag findByTag(String tag) {
        ExhibitTag et = null;
        List<ExhibitTag> list = exhibitTagDAO.findByTag(tag);
        if (list != null && list.size() == 1) {
            et = list.get(0);
        }
        return et;
    }

    @Override
    public List<ExhibitTag> findAllTags() {
        return exhibitTagDAO.findAll();
    }

    @Override
    public ExhibitTagMedia findMediaByTag(String tag) {
        ExhibitTagMedia media = null;
        List<ExhibitTagMedia> list = exhibitTagMediaDAO.findByExhibitTag(tag);
        if (list != null && list.size() == 1) {
            media = list.get(0);
        }
        return media;
    }

    @Override
    public ExhibitTagSummaryResponse findSummaryByExhibitTag(String tag) {
        ExhibitTagSummaryResponse summary = null;
        List<ExhibitTagSummaryResponse> list = exhibitTagMediaDAO.findSummaryByExhibitTag(tag);
        if (list != null && list.size() == 1) {
            summary = list.get(0);
        }
        return summary;
    }
}


The controller layer represents the REST layer. The mobile app will interface with the server via the controller.

package com.tundra.controller;

import java.io.Serializable;
import java.util.List;
import javax.servlet.http.HttpServletResponse;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;
import com.tundra.entity.ExhibitTag;
import com.tundra.entity.ExhibitTagMedia;
import com.tundra.response.ExhibitTagSummaryResponse;
import com.tundra.service.TundraService;

@Controller
public class ExhibitController implements Serializable {

    private static final String ERROR_PREFIX = "Whoops : ";
    private static final long serialVersionUID = 1L;

    @Autowired
    private TundraService tundraService;

    @RequestMapping(value = "/{tag}", method = RequestMethod.GET)
    public @ResponseBody ResponseEntity<?> getExhibitTagByTagId(HttpServletResponse httpResponse,
            @PathVariable(value = "tag") String tag) {
        try {
            return new ResponseEntity<ExhibitTagSummaryResponse>(
                    tundraService.findSummaryByExhibitTag(tag), HttpStatus.OK);
        } catch (Throwable t) {
            return new ResponseEntity<String>(ERROR_PREFIX + t.getMessage(),
                    HttpStatus.INTERNAL_SERVER_ERROR);
        }
    }

    @RequestMapping(value = "/media/{tag}", method = RequestMethod.GET)
    public @ResponseBody ResponseEntity<?> getExhibitMediaByTagId(HttpServletResponse httpResponse,
            @PathVariable(value = "tag") String tag) {
        try {
            return new ResponseEntity<ExhibitTagMedia>(tundraService.findMediaByTag(tag), HttpStatus.OK);
        } catch (Throwable t) {
            return new ResponseEntity<String>(ERROR_PREFIX + t.getMessage(),
                    HttpStatus.INTERNAL_SERVER_ERROR);
        }
    }

    @RequestMapping(value = "/list", method = RequestMethod.GET)
    public @ResponseBody ResponseEntity<?> getExhibits(HttpServletResponse httpResponse) {
        try {
            return new ResponseEntity<List<ExhibitTag>>(tundraService.findAllTags(), HttpStatus.OK);
        } catch (Throwable t) {
            return new ResponseEntity<String>(ERROR_PREFIX + t.getMessage(),
                    HttpStatus.INTERNAL_SERVER_ERROR);
        }
    }
}


With the server code in place, it’s time to look at the mobile app.

As stated earlier, we are using the Ionic framework. It is based on JavaScript/Angular.

The structure of an Ionic project is shown below.


The areas we change are app.js, controller.js and service.js. Index.html is modified only slightly to include our files.

<!-- cordova script (this will be a 404 during development) -->
<script src="cordova.js"></script>

<!-- your app's js -->
<script src="js/app.js"></script>
<script src="js/controller.js"></script>
<script src="js/service.js"></script>

The templates folder holds the HTML files for the various screens. Since we started with a tabbed Ionic project, we have two core HTML templates, tab.html and tab-dash.html. The tab format allows tabbed pages as navigation. We are not using this format, and the files will be renamed later on.


<ion-tab title="My Docent" icon-off="ion-ios-pulse" icon-on="ion-ios-pulse-strong" href="#/tab/dash">
  <ion-nav-view name="tab-dash"></ion-nav-view>
</ion-tab>

The main screen is in tab-dash.html

<ion-header-bar align-title="center" class="bar-stable">
  <h1 class="title">Available Exhibits</h1>
</ion-header-bar>
<ion-content>
</ion-content>

The screen is very basic.


The other screens represent the media types: text.html, video.html, and audio.html. Here is an example of a text view.


The app.js file is loaded first and sets up the basic structure. The application uses the Bluetooth Low Energy (BLE) Central plugin for Apache Cordova. If the app is running on a real mobile device (not in a browser on a PC), the object ‘ble’ will be defined; on a PC it will not be valid. The app.js run function checks for this.

if (typeof(ble) != "undefined") {
    ble.isEnabled(
        function () {
            document.getElementById("bleStatus").style = "color:green;";
        },
        function () {
            document.getElementById("bleStatus").style = "color:red;";
        });
}


The controller layer manages the controls from the HTML (UI) code.

Example: In the main HTML file there is a button to start scanning.

<button ng-click="startScanning()" class="button">Search</button>

In the controller there is the startScanning function. The BLEService is located in the service layer.

$scope.startScanning = function () {
    BLEService.connect(function(exibitTags) {
        $scope.exibitTags = exibitTags;
        $scope.myText = "startScanning";
        isScanning = true;
    });
};

In the service layer.

.service("BLEService", function($http) {
    this.connect = function(onConnect) {
        if (typeof(ble) != "undefined") {
            ble.scan([], 30, onConnect, onError);
        }
    };
})

The onConnect function returns a list of the Bluetooth tags located.

Once the list of devices is returned, the REST service is called to check the tags against the database. The server returns:

  • Organization Name
  • Location Name
  • Exhibit TagName
  • Exhibit TagId
  • Exhibit Tag
  • Exhibit Tag MimeType

The user selects which exhibit they want to view.

Testing the app locally

Ionic can run the app locally by using the command ‘ionic serve’ from the project folder.

C:\Users\rickerg0\workspace\Tundra>ionic serve
The port 35729 was taken on the host localhost - using port 35730 instead
Running live reload server: http://localhost:35730
Watching: www/**/*, !www/lib/**/*, !www/**/*.map
√ Running dev server: http://localhost:8100
Ionic server commands, enter:
 restart or r to restart the client app from the root
 goto or g and a url to have the app navigate to the given url
 consolelogs or c to enable/disable console log output
 serverlogs or s to enable/disable server log output
 quit or q to shutdown the server and exit

The basic screen as viewed in FireFox.


Deploy the app to an Android device from Windows

Make sure the device is connected via the USB port, and set the developer option on the device. If you don’t do this last step, the device will not allow Ionic to connect. From the terminal, issue the command ‘ionic run android’. This will build the apk file and install it on the device.



Posted in Uncategorized

Podcast Interview with Greg Ricker

My podcast with Rik Van Bruggen

Posted in Uncategorized

Building BB8

Soon after the latest Star Wars movie came out, Sphero introduced its model of the BB-8 robot.


Soon afterward people were taking it apart to see how it worked.


Two designs emerged: the hamster cage


and single axis


A few people started posting DIY projects trying to build a “working” BB-8 robot.

I decided to try my hand at building a “working” BB-8 as well, starting in January with the goal of being ready for PortCon (Portland, ME) in June.

The Sphere

There are three primary methods for constructing the sphere.

  1. Purchase a premade plastic sphere (two halves).
    1. This can be expensive, and there is also the issue of assembling the sphere.
  2. 3D print various panels and then assemble them to form a sphere.
    1. Reading what others have said about this process, it’s not simple. Because of the size of the panels and their complexity, this is a difficult process. Besides being expensive, it’s hard on the printer; a number of people report having to repair or replace their printers.
  3. Construct a sphere from a material such as fiberglass.
    1. This started out as the most common method. An early DIY project made it seem much simpler than it really is. It involves covering a ball (beach or yoga) with a paper/canvas mâché mixture. The BB-8 community decided that the body is about 20 cm in diameter. The ball in the DIY project is not that big. As it turns out, finding a beach ball in Maine in January is impossible, so it was off to Amazon.


All three balls are listed as 20 cm. Hmmmm…

First attempt with paper and canvas, following the DIY project.


Clearly this was not going to work. I decided to use fiber glass instead of canvas. I also found a 20 cm ball at a party store.


The Head


The Drive Train(part 1)


June, Portcon  Portland Maine

Despite the drive train issues, BB8 still spoke and the light worked. So it was off to the con.



It’s back to the drawing board with the drive system…

The new drive mechanism.

I started over with new motors, frame and servos. So far it’s looking a lot better.



Two videos before this all gets put back in the ball for a test run..



Posted in Uncategorized

Graph of a musical group’s albums, songs and lyrics

The Idea

Being the dad of a teenage daughter means I listen to a lot of current music: Lady Gaga, Taylor Swift, and recently it’s all about One Direction. As one article recently put it, “One Direction owns the internet in 2015.” Sometimes I hear “this is a sad song” or “this is a happy one.” What could I learn about their music using Neo4j? Could one derive any sort of sentiment from the lyrics? Could I get my daughter interested in this? Only one way to find out…

How to start

The first step was to learn more about the group. There are currently four members, but for most of their albums there were five: Harry Styles, Niall, Liam, Zayn and Louis. They have released five albums: Four, Take Me Home, Up All Night, Midnight Memories and Made in the A.M. With the help of my daughter, we found a site that had the lyrics to all of the songs. What I found was that while some of the song files contained information about who was singing which section, many did not. I was hoping that the sentiment analysis could be aided by knowing the singer; maybe Harry always sings sad/break-up songs (he did date Taylor Swift). Since this information isn’t consistent, I couldn’t count on it.

Song sentiment ?

I felt it was important to have the ability to track lyrics by their location in the song, row and column. This way one could query: what words appear most often at the start (0,0) of a song? How often do certain word combinations (“I” and “you”) appear on the same line? This last question could be useful in better understanding sentiment.
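Here is a toy Python version of the positional idea (the real data lives in Neo4j as Lyric nodes; this just illustrates the row/column bookkeeping, with made-up lyric fragments):

```python
# Each lyric occurrence carries its (row, column) position in the song.
lyrics = [
    ("what", 0, 0), ("makes", 0, 1), ("you", 0, 2),
    ("I", 3, 0), ("can", 3, 1), ("count", 3, 2), ("on", 3, 3), ("you", 3, 4),
]

def words_at_start(lyrics):
    """Words that open a song, i.e. appear at position (0, 0)."""
    return [w for (w, r, c) in lyrics if r == 0 and c == 0]

def on_same_line(lyrics, a, b):
    """Rows (lines) where word a and word b both appear."""
    rows_a = {r for (w, r, _) in lyrics if w == a}
    rows_b = {r for (w, r, _) in lyrics if w == b}
    return sorted(rows_a & rows_b)
```
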


Tools: Python, py2neo, R and RNeo4j.

The Model

The first step was to organize the songs into files by album. Once this was done, it was simple to get Python to read in a list of albums, song titles, and lyrics (words). The graph…

I decided that a Group node would refer to a band or singer. A group would be made up of members, and members were artists. For bands this is fine. I made the choice to treat solo acts the same way, so Lady Gaga or Taylor Swift would be considered a group, a member and an artist.


  • Group
  • Member
  • Artist
  • Album
  • Song
  • Lyrics


  • Album BY Group
  • Lyric IN Song
  • Song ON Album
  • Member ISA_ARTIST Artist
  • Group HAS_MEMBER Member


For the gist I restricted the data to one song per album and reduced the lyrics by two thirds. Even with this there are still 581 lyric nodes. There are 232 unique words; the difference is due to words being repeated in different locations. The word “you” is found 28 times in the five songs.

Query 1

0 rows
5641 ms
| No data returned. |
Nodes created: 602
Relationships created: 609
Properties set: 1774

Find all songs where the word “my” appears

Query 2

MATCH (l:Lyric {name:"my"})-[r0:IN]-(s:Song) RETURN s.name, l.row, l.column

Show distinct lyrics in the song “If I Could Fly”


Query 4

MATCH (l:Lyric)-[r0:IN]-(n:Song) WHERE l.name =~ "(?i)said" RETURN n, l

Show all lyrics in Act My Age.

Show all artists and members for the group

Show all songs on all of the albums. For the gist there is only one song per album.

Show all albums and members for the group

Show all of the lyrics for the song “Kiss You”. There are some connections of lyrics to other songs. This is because those lyrics are used in the same location. The lyric “Baby” is used in “Kiss Me” and “What Makes You Beautiful” in the same row and column.

A query to find songs where the words “I” and “you” are on the same line. The query works well in Python since I can filter out return values of 0. This type of search will be helpful when looking for phrases: words on the same line.

Query 5
MATCH (l1:Lyric{name: 'I'}) --(s:Song)
MATCH (l2:Lyric{name :'you'}) --(s:Song)
RETURN CASE WHEN l1.row = l2.row THEN [l1,l2,s] ELSE 0 END


Song Act My Age










Actual line, row 3: “I can count on you after all that we’ve been through”

If I Could Fly










Actual line, row 5: “I hope that you listen ’cause I let my guard down”

Sentiment and R

Below is a bar chart of the top ten most common lyrics. “I” and “you” are popular.

The last thing to consider is sentiment. Using the simple process of counting positive and negative words, I’d like to see if one can make a determination of sentiment. There isn’t a song-specific word list that I could find, so I elected to use the AFINN list. Following examples from Jeffrey Breen and Andy Bromberg, I was able to get some results. I didn’t divide the songs up into training and test sets; instead I picked two songs and processed them. My daughter suggested that “Best Song Ever” would be happy and “If I Could Fly” would be sad.

The process starts with a query:

graph = startGraph("http://localhost:7474/db/data/")
query = "MATCH (l:Lyric)-[r0:IN]-(n:Song {name:'best song ever'}) RETURN l.name"
ta = cypher(graph, query)

This returned a list of lyrics. Next I counted the number of lyrics that matched a positive or negative word in the AFINN list. I classified the words into “reg” (scale 1-3) and “very” (scale 4-5), for both positive and negative.
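The counting step can be sketched in Python (the word list below is a tiny stand-in for AFINN, whose real scores run from -5 to +5; the cut-offs mirror the reg/very split described above):

```python
# Tiny stand-in for the AFINN word list: word -> score in [-5, 5].
AFINN = {"love": 3, "best": 3, "amazing": 4, "sad": -2, "hate": -3, "worst": -4}

def bucket_scores(words, afinn):
    """Count matched words as reg (|score| 1-3) or very (|score| 4-5), pos/neg."""
    counts = {"pos_reg": 0, "pos_very": 0, "neg_reg": 0, "neg_very": 0}
    for w in words:
        score = afinn.get(w.lower())
        if score is None:
            continue  # word is not in the sentiment list
        side = "pos" if score > 0 else "neg"
        strength = "very" if abs(score) >= 4 else "reg"
        counts[side + "_" + strength] += 1
    return counts

song = ["best", "love", "amazing", "sad"]
counts = bucket_scores(song, AFINN)
```
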

I used the R functions naiveBayes() and predict(). The method is very simple, but the results do suggest that Best Song Ever is “happier” than If I Could Fly. It would be good to get One Direction’s opinion on this.

“Best Song Ever”
           reg   very
positive    10      3
negative     3      0

“If I Could Fly”
           reg   very
positive     1      0
negative     4      0

One thing I noticed is that simple word matching isn’t sufficient. For movie reviews or emails this may work, but songs are more complex.

For example, a happy song might have the line “I love you” while a sad song might have the line “I used to love you”. Both contain the positive word “love”, but the second line could be viewed as sad: love lost. This is where querying lyrics on the same line could help. It’s more complex than matching positive and negative words.
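One cheap way to capture that is to look at what precedes each sentiment word. This Python sketch is a toy heuristic (not what I actually ran in R): it flips a positive word's contribution when a negating phrase appears earlier on the same line.

```python
# Phrases that flip a positive word into a negative reading.
NEGATORS = {"used to", "no longer", "don't", "never"}

def adjusted_sentiment(line, positive_words):
    """Score a line: +1 per positive word, -1 if a negating phrase precedes it."""
    text = line.lower()
    score = 0
    for word in positive_words:
        idx = text.find(word)
        if idx < 0:
            continue  # positive word not on this line
        preceding = text[:idx]
        if any(neg in preceding for neg in NEGATORS):
            score -= 1   # "used to love" reads as loss, not joy
        else:
            score += 1
    return score
```
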

Conclusion

This was fun, and I got a little father-daughter time in as well. I’d like to pursue this to see what can be done by considering phrases and connected words.

Next up: Lady Gaga

Posted in Uncategorized