
Feb. 27th-Mar. 5th Update on Facial Expression

During these two weeks, my task was to update the FaceExpressionManager script. From our tests, I found that the reset-blendshape method in FaceExpressionManager would wipe out the work done in the face-feature process: it reset all blendshape weights, including the changes made during face-feature customization. That is the first problem I needed to solve. Another update is a new scriptable object that stores the skin color and normal map textures for the head model as presets. During our discussion, I also had the idea of using input text as a trigger to change the facial expression when the character's emotional state changes.


Problem 1: Update the reset-blendshape-weight method

In the previous FaceExpressionManager setup, the reset method reset all blendshape weights whenever the face expression changed. I needed to update this method so that the reset does not affect the changes made during the face-feature process.

To do this, I needed a way for the FaceExpressionManager script to receive the change data from the face features and keep it.

First, I got a reference to the FaceFeature script inside FaceExpressionManager.

Second, I updated the reset method to make sure it receives the changes from FaceFeatureController.

Third, I created a method called ResetSettingValue in FaceFeatureController.

With that, the problem was solved: changing the face expression now preserves the face-feature changes.
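The steps above can be sketched roughly as follows. Only FaceExpressionManager, FaceFeatureController, and ResetSettingValue come from this write-up; every other name, field, and the blendshape-index bookkeeping is an assumption for illustration.

```csharp
using UnityEngine;

// Minimal sketch, assuming the expression blendshapes and the
// face-feature blendshapes live on the same SkinnedMeshRenderer.
public class FaceExpressionManager : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;        // head mesh
    public FaceFeatureController featureController; // keeps feature values
    public int[] expressionBlendshapeIndices;       // indices owned by expressions

    // Reset only the expression blendshapes, then let the
    // FaceFeatureController restore its own saved values.
    public void ResetBlendshapes()
    {
        foreach (int index in expressionBlendshapeIndices)
            faceRenderer.SetBlendShapeWeight(index, 0f);

        // Re-apply the face-feature changes so they survive the reset.
        featureController.ResetSettingValue();
    }
}

// Hypothetical controller side: it caches the player's feature
// settings and can re-apply them after a reset.
public class FaceFeatureController : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;
    public int[] featureBlendshapeIndices;

    private float[] savedWeights;

    // Call this whenever the player edits a face feature.
    public void SaveSettingValue()
    {
        savedWeights = new float[featureBlendshapeIndices.Length];
        for (int i = 0; i < featureBlendshapeIndices.Length; i++)
            savedWeights[i] =
                faceRenderer.GetBlendShapeWeight(featureBlendshapeIndices[i]);
    }

    // Called by FaceExpressionManager after it resets expressions.
    public void ResetSettingValue()
    {
        if (savedWeights == null) return;
        for (int i = 0; i < featureBlendshapeIndices.Length; i++)
            faceRenderer.SetBlendShapeWeight(featureBlendshapeIndices[i],
                                             savedWeights[i]);
    }
}
```

The key design choice is splitting ownership: the expression manager only ever zeroes the indices it owns, and everything feature-related is restored from the controller's cached weights.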


Problem 2: MaterialProperty scriptable object

Since we also want to change the skin color and texture (normal map) of the head, I created a new scriptable object class to store those values as presets.

With this script, we can store different skin colors and textures and use those presets in other scripts.
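A minimal sketch of such a preset asset. The class name, field names, and shader property names here are assumptions; only the idea of a scriptable object holding a skin color and a normal map comes from the write-up.

```csharp
using UnityEngine;

// Hypothetical preset asset: one skin color plus one normal map,
// creatable from the Assets > Create menu.
[CreateAssetMenu(fileName = "HeadMaterialPreset",
                 menuName = "Presets/Head Material Preset")]
public class HeadMaterialPreset : ScriptableObject
{
    public Color skinColor = Color.white;
    public Texture2D normalMap;

    // Apply the preset to a head material. "_BaseColor" and "_BumpMap"
    // are the usual URP Lit property names; adjust them to match the
    // shader the head model actually uses.
    public void ApplyTo(Material headMaterial)
    {
        headMaterial.SetColor("_BaseColor", skinColor);
        headMaterial.SetTexture("_BumpMap", normalMap);
    }
}
```

Other scripts can then hold an array of these presets and call ApplyTo on the head's material to swap looks at runtime.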


Problem 3: Using input text as a trigger to change the face expression (unsolved)

I had the idea of using text as a trigger to change the face expression; it would feel more realistic and keep our UI panel clean. The reason for using the player's input text is that I thought it gives better control over the emotional change, since the response data might convey the emotion without mentioning the keyword directly.
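The idea could look something like the sketch below, assuming FaceExpressionManager exposes a SetExpression(string) method and assuming some hook eventually delivers the player's input text (which is exactly the part that is still unresolved). The keyword table and all other names are hypothetical.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: map emotion keywords in the player's input
// text to expression names. Where the input text actually comes from
// in the Convai plugin is still the open question.
public class TextEmotionTrigger : MonoBehaviour
{
    public FaceExpressionManager expressionManager;

    // Keyword -> expression name; extend as needed.
    private static readonly Dictionary<string, string> keywordToExpression =
        new Dictionary<string, string>
        {
            { "happy", "Smile" },
            { "sad",   "Frown" },
            { "angry", "Angry" },
        };

    // Call this from wherever the plugin exposes the input text.
    public void OnPlayerInput(string inputText)
    {
        string lowered = inputText.ToLowerInvariant();
        foreach (var pair in keywordToExpression)
        {
            if (lowered.Contains(pair.Key))
            {
                Debug.Log($"Trigger expression: {pair.Value}");
                expressionManager.SetExpression(pair.Value);
                return;
            }
        }
    }
}
```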


The first step was to find out where Convai stores the input text data. Their documentation mentions a script called ConvaiTextInOut that handles this job, but I could not find it in their Unity plugin package.

So I tried hooking into their Subtitle Chat UI script to get the input text data and call the SetExpression method in the FaceExpressionManager script.

But it did not work, so that script does not handle the input text data. I then tried the next most likely candidate, Question Answer UI.

That did not work either; even the Debug.Log message did not show up.


So I still need to find the script that actually handles the input text data, but I could not locate it in their package. I asked the Convai team via Discord and have not gotten a response yet.


Next steps: see whether their team can tell me where the input text data is handled. Once I have access to the input text data, I can wire the facial-expression method into it, but for now the trigger is still the bottleneck.
