How To Display Image From Raw Bytes In Plugin

Hi all,

I am currently trying to display images taken from the fills of a selected object within a plugin. I am following this guide, where I get the image by hash and obtain its byte array.

const getImageByHash = async (hash: string) => {
    console.log('Image found');
    const image = figma.getImageByHash(hash);
    if (image === null) return;
    const imageArray = await image.getBytesAsync();
    console.log('Posting image byte array to Figma...');
    figma.ui.postMessage({type: 'renderImage', img: imageArray});
};

I am trying to then display the image in the plugin, but haven’t been successful in figuring out how to do so.

I’ve tried converting the bytes to a Blob URL and using that as the image src, but it doesn’t seem to work.

const blob = new Blob([texture], {type: 'image/png'});
const blobURL = URL.createObjectURL(blob);
const image = await new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => resolve(img);
    img.onerror = () => reject();
    img.src = blobURL;
});

I am using React for the Figma plugin frontend. Any advice for this would be greatly appreciated!
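One thing worth checking first: on the UI (iframe) side, the posted message arrives on `window.onmessage` wrapped in `event.data.pluginMessage`. Here is a minimal sketch of unpacking it — the `RenderImageMessage` type and `handlePluginMessage` helper are my own names, assuming the `{type: 'renderImage', img}` shape from the snippet above:

```typescript
// Hypothetical message shape, matching the postMessage call in the plugin code above
type RenderImageMessage = { type: 'renderImage'; img: Uint8Array }

// Extract the image bytes from a message event's data, or return null if it
// isn't the message we're waiting for. Figma wraps plugin messages in `pluginMessage`.
function handlePluginMessage(data: { pluginMessage?: RenderImageMessage }): Uint8Array | null {
    const msg = data.pluginMessage
    return msg && msg.type === 'renderImage' ? msg.img : null
}

// Usage in the UI:
// window.onmessage = (event) => {
//     const bytes = handlePluginMessage(event.data)
//     if (bytes) { /* set state / draw the image */ }
// }
```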

Create a canvas element in your component, use useRef to get a reference to it, and use useEffect to draw your image on the canvas. At least, that’s how I do it.

import { useEffect, useRef } from 'react'

export type CanvasImageProps = {
    width: number
    height: number
    bytes: Uint8Array
}

export function CanvasImage(props: CanvasImageProps): JSX.Element {
    const canvas = useRef<HTMLCanvasElement>(null)

    useEffect(() => {
        if (canvas.current === null) return
        const ctx = canvas.current.getContext('2d')
        if (ctx === null) return

        const loadImage = async () => {
            // Wrap the raw bytes in a Blob and decode them through an <img> element
            const blob = new Blob([Uint8Array.from(props.bytes)])
            const url = URL.createObjectURL(blob)
            const image = await new Promise<HTMLImageElement>((resolve, reject) => {
                const img = new Image()
                img.onload = () => resolve(img)
                img.onerror = () => reject(new Error("Could not decode bytes due to an error"))
                img.src = url
            })
            URL.revokeObjectURL(url)
            // Scale the image to fit the canvas while preserving aspect ratio
            const hRatio = ctx.canvas.width / image.width
            const vRatio = ctx.canvas.height / image.height
            const ratio = Math.min(hRatio, vRatio)
            ctx.drawImage(image, 0, 0, image.width, image.height, 0, 0, image.width * ratio, image.height * ratio)
        }
        loadImage()
    }, [props.bytes])

    return (
        <canvas ref={canvas} width={props.width} height={props.height}/>
    )
}
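If you’d rather skip the canvas, another option (a sketch of my own, not from the thread — `bytesToDataUrl` is a hypothetical helper) is to base64-encode the bytes into a data URL and use it directly as an `<img>` src; unlike a blob URL it needs no revocation:

```typescript
// Hypothetical helper: encode raw image bytes as a base64 data URL usable as an <img> src.
// Assumes the bytes are PNG-encoded, as in the fill example above.
function bytesToDataUrl(bytes: Uint8Array, mime: string = 'image/png'): string {
    let binary = ''
    for (const b of bytes) binary += String.fromCharCode(b)
    // btoa is available in browsers, including the Figma plugin UI iframe
    return `data:${mime};base64,${btoa(binary)}`
}

// e.g. img.src = bytesToDataUrl(imageArray)
```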