Thursday 25 June 2015

Dr Hearing Open Source Project

Disclaimer: the following post contains extracts from my final year thesis and may not be re-used in any form without my permission. Thank you for reading.



Project DHOSP - What is DHOSP?
Dr Hearing Open Source Project (DHOSP) was my final year project, designed to identify hearing impairments in adults. The project was completed but lacked the production quality needed for publication, so part of my project scope was to bring the project to my colleagues and fellow developers so that we may develop the best Android application for identifying hearing deficiencies.

So what is a hearing test?

A hearing test provides a methodology for diagnosing an individual’s hearing impairments or deficiencies. ISO 8253-1:2010 is an international standard for PTA (Pure Tone Audiometry); it specifies procedures and requirements for PTA by air conduction and bone conduction. Pure tone audiometry measures an individual’s hearing across a range of test frequencies using a standardized method that specifies the procedure for determining thresholds, the range of test frequencies and presentation levels, and the way thresholds are presented graphically, including the symbols used to depict the results. We use PTA to evaluate possible hearing losses and to determine the type of hearing loss an individual may have.



Hearing loss can be defined as the amount a person’s hearing level changes as a result of some adverse influence. This means that some structure or function of the ear that is crucial to hearing has been damaged.


There are three main forms of hearing loss: sensorineural hearing loss, affecting the cochlea; conductive hearing loss, affecting the outer or middle ear; and mixed hearing loss, which is a combination of the two.


The World Health Organization (WHO) defines “disabling hearing impairment in adults as a permanent unaided hearing threshold level (average for frequencies 0.5, 1, 2, 4 kHz) for the better ear of 41 dB or greater (WHO, 2001). In children under 15 years of age, disabling hearing impairment is defined as permanent unaided hearing threshold level (average for frequencies 0.5, 1, 2, 4 kHz) for the better ear of 31 dB or greater.”


The auditory pathway includes the external ear, the middle ear and the inner ear, followed by the auditory nerve, which ends in the auditory centres of the auditory cortex.




· The external ear consists of the pinna, ear canal and eardrum. Sound travels down the ear canal to the eardrum, causing it to move or vibrate.


· The middle ear is a space behind the eardrum that contains three small bones called ossicles. This string of tiny bones is connected to the eardrum at one end and to the oval window at the other end which connects to the inner ear. Vibrations from the eardrum cause the ossicles to vibrate which, in turn, creates movement of the fluid in the inner ear.


· Movement of the fluid in the inner ear, or cochlea, causes the hair cells to move. This movement of the hair cells sends electrical signals from the inner ear up the auditory nerve to the brain.


What is an Audiogram?



An audiogram plots hearing level in decibels (dB) on the Y-axis. If a hearing loss is present, the plotted points sit further down the graph: a higher dB value means a louder tone was needed before the person could hear it, so the more severe the hearing loss, the further down the graph the values fall. The X-axis plots the frequency of the pure tones in hertz (Hz). An audiogram usually starts at 125 Hz on the left, increasing to 8,000 Hz on the right.

Purpose of the Project
The purpose of this project was to develop an application that helps identify common hearing impairments by using an audiogram, which pictures how a person hears. The audiogram describes the hearing of a person across different frequencies. It can be used as a tool to determine the amount of damage done or to help determine its cause.

Goal of the Project
The goal of the project was to develop an application to determine hearing impairments in adults.

Motivation
To provide an easy and simple application that allows users to quickly take a hearing test for an assessment and to read their results from the audiogram.

Considerations
Hearing impairment is an increasing problem across most populations of the world and is something people should be aware of; by taking hearing tests, they can become more aware of their own level of hearing and how best to keep it healthy. In the EU it is estimated that more than 55 million people have hearing impairments, at an estimated cost of 160 billion euros per year. According to one study, a mild hearing loss costs 2,000 euros per individual each year, a moderate loss 6,600 euros and a severe loss 11,000 euros.

From the above studies we can see that this is a problem that needs careful analysis and further research.

Measurement
The advantage this application brings to users is that it provides a free and quick self-test. Whereas many Android or web applications require an internet connection and a backend database, this project does all processing locally on the user's device.

System Implementation


Android Activities
An activity represents a single screen that the user sees on the device. An application usually consists of multiple activities, and activities are the most visible part of the application. In Android you can be looking at an activity of one application and, moments later, start another activity in a completely separate application. For example, if you are in the Calendar application and decide to call a friend, you launch an activity belonging to the Phone application from within Calendar.
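
As an illustration, here is a minimal sketch of an activity. The class name and layout here are hypothetical examples, not taken from the project:

    import android.app.Activity;
    import android.os.Bundle;

    // A minimal activity: one screen of the application.
    public class MainActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Inflate this screen's layout from its XML definition.
            setContentView(R.layout.activity_main);
        }
    }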


Android life-cycle
Starting an activity can be expensive in device CPU and RAM. It can sometimes involve creating a new Linux process, allocating memory for all the new objects, inflating the objects from XML layouts, and setting up the screen. In Android the activity life cycle is managed by the Activity Manager.

Activity Manager is responsible for creating, destroying, and managing activities. For example, when the user starts an application for the first time, the Activity Manager will create its activity and put it onto the screen. Later, when the user switches screens, the Activity Manager will move that previous activity to a holding place. This way, if the user wants to go back to an older activity, it can be started more quickly. Older activities that the user hasn’t used in a while will be destroyed in order to free more space for the currently active one. This mechanism is designed to help improve the speed of the user interface and thus improve the overall user experience.

Programming for Android is conceptually different from programming for some other environments. In Android, you find yourself responding more to certain changes in the state of your application rather than driving that change yourself. It is a managed, container-based environment, similar to programming for Java applets or servlets. So, when it comes to the activity life cycle, you don’t get to say what state the activity is in, but you have plenty of opportunity to say what happens during the transitions from state to state. The figure below shows the states that an activity can go through.



Managing the lifecycle of activities by implementing call-back methods is crucial to developing a robust application. The lifecycle of an activity is directly affected by its association with other activities.

The call-back methods correspond to three states, as shown in the sketch after this list:

1. Resumed

The activity is in the foreground of the screen and has user focus.

2. Paused

Another activity is in the foreground and has focus, but this one is still visible. That is, another activity is visible on top of this one and that activity is partially transparent.

3. Stopped

The activity is completely hidden by another activity.
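
A minimal sketch of how these state transitions can be observed through the life-cycle call-backs. The class name and log tag are illustrative, not from the project:

    import android.app.Activity;
    import android.os.Bundle;
    import android.util.Log;

    public class LifecycleDemoActivity extends Activity {
        private static final String TAG = "LifecycleDemo";

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            Log.d(TAG, "onCreate: activity is being created");
        }

        @Override
        protected void onResume() {
            super.onResume();
            Log.d(TAG, "onResume: in the foreground with user focus");
        }

        @Override
        protected void onPause() {
            super.onPause();
            Log.d(TAG, "onPause: partially obscured, losing focus");
        }

        @Override
        protected void onStop() {
            super.onStop();
            Log.d(TAG, "onStop: completely hidden by another activity");
        }
    }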



Figure 26: Diagram of how the methods are called for each button.

Each button has an onClick() method which is specified in the XML file. Once a button is clicked, an Intent object is created which takes the current class instance and the class to be launched. Intents provide the ability to bind components together, which lets us launch one activity from another.
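
A sketch of what such a handler might look like. MenuActivity and activity_menu are hypothetical names; HearingTestActivity is the class from the project:

    import android.app.Activity;
    import android.content.Intent;
    import android.os.Bundle;
    import android.view.View;

    public class MenuActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_menu);
        }

        // Referenced from the layout XML via android:onClick="startHearingTest".
        public void startHearingTest(View view) {
            // The Intent takes the current activity and the class to be launched.
            Intent intent = new Intent(this, HearingTestActivity.class);
            startActivity(intent);
        }
    }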

Another important aspect of activities is declaring them in the Android Manifest, because if an activity is not declared the application will crash with an exception. In the Manifest we use an activity tag and pass in the location of the class within the project package.
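
For example, a declaration along these lines inside the application element of AndroidManifest.xml (the label is a placeholder):

    <!-- Declares the activity so the system can launch it. -->
    <activity
        android:name=".HearingTestActivity"
        android:label="Hearing Test" />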




Implementing Frequency Generator
Generating the pure tone requires a basic sine wave. In the FrequencyGenerator class we create an array of generated sound samples and pass it into the AudioTrack.write function, which writes the data to the audio sink for playback in streaming mode. The Byte class is used as it wraps the primitive byte value in an object. The encoding scheme is PCM with 16 bits per sample; pulse code modulation encodes an audio waveform in the time domain. The sample rate measures how many samples are played each second. We use getChannelConfiguration, which returns the configured channel configuration.



The encoding plays an important role, as the byte array is played back as audio. The AudioTrack write method takes three parameters: the byte array containing the audio data, the offset in bytes, and the number of bytes to write.

Each sample is created as a short, a 16-bit signed two’s complement integer; the Short class wraps a value of the primitive type in an object.

Configuring the sample rate at 44,100 Hz gives the highest quality sound. Each pure tone was then tested by ear to check that it sounded equally good.
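
Putting these pieces together, a minimal sketch of such a generator might look like this (the class and method names are illustrative, not necessarily those used in the project):

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioTrack;

    public class FrequencyGenerator {
        private static final int SAMPLE_RATE = 44100; // samples per second

        // Generate 16-bit PCM samples for a pure sine tone.
        public static byte[] genTone(double freqHz, double durationSec) {
            int numSamples = (int) (durationSec * SAMPLE_RATE);
            byte[] pcm = new byte[2 * numSamples]; // 2 bytes per 16-bit sample
            for (int i = 0; i < numSamples; i++) {
                double sample = Math.sin(2 * Math.PI * i * freqHz / SAMPLE_RATE);
                short val = (short) (sample * Short.MAX_VALUE); // scale to 16-bit range
                // Little-endian PCM: low byte first.
                pcm[2 * i] = (byte) (val & 0x00ff);
                pcm[2 * i + 1] = (byte) ((val >> 8) & 0xff);
            }
            return pcm;
        }

        // Stream the generated samples to the audio hardware.
        public static void playTone(byte[] pcm) {
            AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    pcm.length, AudioTrack.MODE_STREAMING);
            track.play();
            track.write(pcm, 0, pcm.length); // audio data, offset, size in bytes
            track.stop();
            track.release();
        }
    }

A test would then call genTone with each audiometric frequency in turn, for example from 250 Hz up to 8,000 Hz.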




Implementing Audiogram with Android plot library
Rendering the audiogram requires a comprehensive charting library. The two most popular open source libraries available, AndroidPlot and GraphView, were both tried and tested.



After testing both GraphView and AndroidPlot, the latter proved more useful as it contained helpful examples and its source code was easier to understand. To implement AndroidPlot within the project, we first have to include it in the project build path.



The main audiogram is implemented in the HearingTestActivity class. This class contains a method to render the XYPlot, which we associate with the XYPlot declared in our audiogram XML file. We create a reference object of type XYPlot; this holds the information for the plot’s X and Y axes and the background colour. The XYPlot class works together with XYSeries, XYSeriesFormatter and XYSeriesRenderer, which are interfaces or abstract classes providing partial or no implementation.
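
A sketch of how this wiring might look. The view ID, layout name and sample data are placeholders, and the exact formatter constructor varies between AndroidPlot versions:

    import java.util.Arrays;

    import android.app.Activity;
    import android.graphics.Color;
    import android.os.Bundle;

    import com.androidplot.xy.LineAndPointFormatter;
    import com.androidplot.xy.SimpleXYSeries;
    import com.androidplot.xy.XYPlot;
    import com.androidplot.xy.XYSeries;

    public class AudiogramDemoActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.audiogram);

            // The XYPlot declared in the audiogram XML layout.
            XYPlot plot = (XYPlot) findViewById(R.id.audiogramPlot);

            // Hearing thresholds in dB at each test frequency, low to high.
            Number[] thresholds = {20, 25, 30, 45, 50, 40};
            XYSeries series = new SimpleXYSeries(
                    Arrays.asList(thresholds),
                    SimpleXYSeries.ArrayFormat.Y_VALS_ONLY,
                    "Left ear");

            // Line colour, vertex colour, fill colour (none), point labels (none).
            LineAndPointFormatter formatter =
                    new LineAndPointFormatter(Color.BLUE, Color.BLACK, null, null);

            plot.addSeries(series, formatter);
        }
    }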




Implementing Fragments
Fragments in Android can be described as an activity within an activity. Fragments were chosen because a way to combine multiple activities was required, with different classes acting on a single screen view. In the design, two fragments were added to one layout view, the HearingTestActivity. To implement this, first a layout view was created for the playtone_fragment; this is where the user controls the testing options. The other fragment is the audiogram, which only draws an XYPlot using the AndroidPlot library. The two fragments are then combined into one layout. The reason for doing this is that we can separate the logic of both classes into their own respective layouts, and once we embed them as fragments the code is less coupled and not totally dependent on other methods.

The implementation of the fragments was one of the trickiest aspects of the project and took me a long time. There are many ways to implement fragments and no single best solution, so it was trial and error to discover which method worked best. The method that worked for me was to declare fragment tags in the hearing test layout, with a class attribute and a tools:layout attribute defined for each fragment, as sketched below. The tools:layout attribute defines which layout is rendered in the editor preview and is very important.
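
A rough sketch of such a layout. The IDs, package name and sizing here are placeholders; the layout and fragment names follow the ones mentioned above:

    <!-- hearingtest_view.xml: two fragments combined on one screen. -->
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:tools="http://schemas.android.com/tools"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:orientation="vertical">

        <fragment
            android:id="@+id/playtoneFragment"
            class="com.dhosp.PlayToneFragment"
            android:layout_width="match_parent"
            android:layout_height="0dp"
            android:layout_weight="1"
            tools:layout="@layout/playtone_fragment" />

        <fragment
            android:id="@+id/audiogramFragment"
            class="com.dhosp.AudiogramFragment"
            android:layout_width="match_parent"
            android:layout_height="0dp"
            android:layout_weight="1"
            tools:layout="@layout/audiogram" />

    </LinearLayout>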


To complete the implementation of fragments, we create a class for each XML file. These two classes contain an onCreateView method which inflates the layout into the hearingtest_view.xml activity. The inflater instantiates a layout XML file into its View objects; the layout XML is compiled at build time and inflated at runtime. The last thing to make sure of is to include the android-support-v4.jar file in the project build path.
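
A minimal sketch of one of these fragment classes. The class name is illustrative; the layout name and the support library come from the project:

    import android.os.Bundle;
    import android.support.v4.app.Fragment;
    import android.view.LayoutInflater;
    import android.view.View;
    import android.view.ViewGroup;

    public class PlayToneFragment extends Fragment {
        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                                 Bundle savedInstanceState) {
            // Instantiate the fragment's layout XML into a View object;
            // the container is the parent view inside hearingtest_view.xml.
            return inflater.inflate(R.layout.playtone_fragment, container, false);
        }
    }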



Become a member of Team DHOSP

Link to the project repository: https://github.com/JunSoftware109/AndroidProjectHearing

If you would like to be an official contributor contact me at junmalik109@gmail.com.
I will then have a chat with you and welcome you to the team.

Thanks for reading and goodbye for now!


1 comment:

  1. u r app is just giving sound in left and right ear but no audiogram
