For this animation, I intended to walk slowly backwards and forwards on the spot to see how much of the movement the AI could track. Despite this very simple motion, the AI struggled to capture my movement even though the camera caught everything perfectly. I began checking external factors and found nothing wrong with the lighting or the environment that could be causing the issue. After replaying my footage, I realized that I was wearing an all-black outfit, and decided to test whether this was the problem by changing into clothes that made the different parts of my body more distinguishable for the AI.
This is the second animation I created and a re-creation of the first. This time the AI tracked my body much more smoothly than before. Nothing in this take had been altered other than my choice of clothes, which told me that the AI struggled with dark clothing. Research has shown that AI facial recognition struggles to identify Black faces compared with white faces, which makes me wonder whether the same is true of AI body tracking, as this would make the AI biased towards those with lighter skin tones.
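The source does not name the tracking tool or its output format, but as an illustration only: assuming the tracker can export per-frame landmark coordinates, a simple jitter metric could put a number on the difference between the all-black take and this one. This is a hypothetical diagnostic sketch, not part of the original workflow; the function name and data shape are my own assumptions.

```python
# Hypothetical diagnostic: given per-frame 2D landmark positions exported
# from a pose tracker, measure how "jittery" the tracked skeleton is.
# Lower jitter suggests smoother, more confident tracking.

def mean_jitter(frames):
    """frames: list of frames, each a list of (x, y) landmark tuples.
    Returns the average frame-to-frame landmark displacement."""
    if len(frames) < 2:
        return 0.0
    total, count = 0.0, 0
    for prev, curr in zip(frames, frames[1:]):
        for (x0, y0), (x1, y1) in zip(prev, curr):
            # Euclidean distance the landmark moved between frames
            total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
            count += 1
    return total / count

# A steadily tracked landmark vs. one that jumps around between frames:
steady = [[(0.50, 0.50)], [(0.51, 0.50)], [(0.52, 0.50)]]
jumpy  = [[(0.50, 0.50)], [(0.70, 0.30)], [(0.45, 0.65)]]
print(mean_jitter(steady) < mean_jitter(jumpy))  # True
```

Comparing this score across the two takes would give a more objective measure of "smoother than before" than eyeballing the animation alone.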
For this animation, I wanted to test the body-tracking AI's ability to follow constantly changing movement by acting out a variety of poses for it to replicate. I moved different parts of my body in different ways to see how well the AI responded to certain movements and whether it favoured some actions over others. I found that the AI tracked these movements well, and this became one of my more accurate animations. However, when I performed an action by leaning backwards, the AI struggled to comprehend what I was doing and temporarily showed inaccuracies. This told me that the AI may be biased towards forward movement and struggles when objects or people move further away from the camera. A possible fix for this issue may be to use multiple cameras, or to film the individual from the side for this specific pose.
At this point in the project, I decided to start using objects in my animations. I wanted to test two things: whether the AI would track an object moving along with the human body, or whether it would ignore the object and track only the body, and how each scenario would affect the accuracy of the animated model. For this animation, I chose an everyday activity and brushed my hair in front of the camera. I found that the AI ignored the hairbrush and attempted to track only the movement of the body. The animation was surprisingly accurate to my movements; however, the AI appeared to replace the hairbrush with my hand, placing the movement further from the rest of the body than it was in reality. This told me that the AI is biased towards human bodies over objects, but can still achieve some accuracy despite this.
The second everyday action I performed for this project was reading a book, as I believed this would be harder for the AI to track and animate. This theory proved correct. During the first half of the take, I posed with the book to see how well the AI would track my body. Afterwards, I flipped through the pages of the book to see whether the AI could replicate these finer movements. I found that while the AI was able to process my movements, it failed to track most of the important details of how I was moving. Unlike in the previous animation, the movement of the object in this scene needed to be tracked for the rest of my actions to make sense. This told me that the AI's bias against objects can affect the accuracy of the final product, making it harder to navigate and interpret.
One of my main concerns with using AI to track the human body was the effect it may have on disabled bodies. Physical disabilities sometimes mean that people are unable to move fully without the help of external technology, and I began to wonder how such movement would be tracked if AI motion tracking was biased against non-human movement. For this first attempt, I used a walking stick to showcase how differently those who need one may move, with the equipment acting as an extension of the body. The result showed that the AI was able to process that I was moving, with the model swaying side to side and slightly shuffling its legs. However, due to its bias against objects, the AI was unable to process my use of the walking stick, and the final animation remained on the spot even though I moved in the same way as in my walking animations. This result tells me that AI motion tracking is biased against certain people as well as objects, seemingly favouring the able-bodied person.
After the result of the previous attempt to track a disabled body, I suspected that tracking a wheelchair user would be even worse because more equipment was involved. However, the AI was surprisingly accurate in detecting the body's movements. It successfully gathered that the body was in a sitting position and was able to track the pushing motion needed to move a wheelchair. However, it struggled to animate the movement of the arms, likely because it could not process the wheels of the chair due to its bias against non-human motion. This same bias may also explain why the animation remained stationary, unable to register that I was moving backwards and forwards.