
Hi,


I’m a student researcher on my team (design, dev, ML ops). 🥷


I’ve been tasked with looking into how our UX designers can show interaction with ML models (e.g. getting results from pre-trained Hugging Face models or the OpenAI APIs) in our existing Figma prototype. This is for building ML-based web applications, potentially with our own or open-source ML models.


Has anyone faced a similar challenge? How did you go about it? Any workarounds/resources I can look into?


I’m at the exploratory phase, so I’m open to any ideas & advice.


Thanks,

Jo


#helpastudentout

Since Figma prototypes can’t embed live content or change dynamically while you interact with them, there isn’t much you can do beyond mocking up some sample interactions: give the user a choice from a limited number of options instead of letting them input anything they want (free-form input isn’t possible in Figma in the first place).
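If the team later moves from Figma to a coded prototype, the same "limited options, canned outputs" idea carries over. Here's a minimal sketch of that approach in Python — all names and responses are hypothetical placeholders, standing in for whatever real model calls you'd eventually wire up:

```python
# Mock of an ML backend for prototyping: instead of calling a real
# Hugging Face or OpenAI model, map a fixed set of user choices to
# pre-recorded outputs, mirroring the fixed-option flow you'd build
# in a Figma prototype. All options/strings here are made up.

CANNED_RESPONSES = {
    "summarize": "Here is a short summary of your document...",
    "translate": "Voici la traduction de votre texte...",
    "classify": "Predicted label: positive (confidence 0.92)",
}

def mock_model(choice: str) -> str:
    """Return a pre-recorded 'model' output for one of the allowed choices."""
    if choice not in CANNED_RESPONSES:
        raise ValueError(f"Unsupported option: {choice!r}")
    return CANNED_RESPONSES[choice]

if __name__ == "__main__":
    print(mock_model("classify"))
```

Because the interface only ever offers the three canned options, designers can test the interaction flow without any model, latency, or API keys, then swap `mock_model` for a real inference call later.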


Thanks Gleb!

