Running TFLite Micro Models in PSRAM on an ESP32

Hey,

Is there a way to run a model on an ESP32 microcontroller in PSRAM instead of DRAM? My model is quite big and doesn't fit in the 512 KB of DRAM alone, so I was wondering if I could use the PSRAM instead.

Any help is appreciated.

Hi @Palettenbrett, as far as I know, PSRAM is pseudo-static RAM, which is a type of DRAM. I think you can use the ESP32's PSRAM for running a TFLite model. Thank you.

Thank you for your reply,

how exactly can I use the PSRAM for model inference in code?
What I saw in the examples was:

    constexpr int kTensorArenaSize = 12 * 1024;
    uint8_t tensor_arena[kTensorArenaSize];

But this doesn't allocate from PSRAM, right? A static array like this lands in internal DRAM.
I’m programming in the Arduino IDE.

Thanks in advance.

You should be aware that PSRAM is significantly slower and more power-hungry than internal RAM.

I think the Arduino IDE may handle enabling the PSRAM for you (there is a PSRAM option under the Tools menu for boards that have it), but I'm not sure. For PlatformIO and ESP-IDF you need to enable it in menuconfig; there are a number of options to consider when enabling it.
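Once PSRAM is enabled, you can allocate the tensor arena from it at runtime instead of declaring a static array. Here is a minimal sketch of that idea, assuming an ESP32 target with PSRAM enabled; `kTensorArenaSize` and `allocate_tensor_arena` are illustrative names, not part of any library:

```cpp
#include <stdint.h>
#include <stdlib.h>
#if defined(ESP_PLATFORM) || defined(ARDUINO_ARCH_ESP32)
#include "esp_heap_caps.h"  // heap_caps_malloc / MALLOC_CAP_SPIRAM
#endif

// Arena size is illustrative -- use whatever your model actually needs.
constexpr int kTensorArenaSize = 300 * 1024;

uint8_t* tensor_arena = nullptr;

// Returns true if the arena was allocated successfully.
bool allocate_tensor_arena() {
#if defined(ESP_PLATFORM) || defined(ARDUINO_ARCH_ESP32)
    // Request the allocation explicitly from external PSRAM.
    // (On the Arduino-ESP32 core, ps_malloc() is a convenience wrapper
    // that does the same thing.)
    tensor_arena =
        (uint8_t*)heap_caps_malloc(kTensorArenaSize, MALLOC_CAP_SPIRAM);
#else
    // Fallback so this sketch also compiles off-target.
    tensor_arena = (uint8_t*)malloc(kTensorArenaSize);
#endif
    return tensor_arena != nullptr;
}
```

You would then pass `tensor_arena` (after checking it is non-null) to the `tflite::MicroInterpreter` constructor in place of the static array.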

Linked below is an example of using PSRAM. The project uses Edge Impulse, but that uses TensorFlow Lite Micro under the hood…

    #ifdef EI_BUFFER_IN_PSRAM
        // Allocate the first audio buffer in external PSRAM.
        inference.buffers[0] = (int16_t *)heap_caps_malloc(n_samples * sizeof(int16_t), MALLOC_CAP_SPIRAM);
    #else
        inference.buffers[0] = (int16_t *)malloc(n_samples * sizeof(int16_t));
    #endif

    if (inference.buffers[0] == NULL)
    {
        ESP_LOGE(TAG, "Failed to allocate %d bytes for inference buffer", n_samples * sizeof(int16_t));
        return false;
    }

    #ifdef EI_BUFFER_IN_PSRAM
        inference.buffers[1] = (int16_t *)heap_caps_malloc(n_samples * sizeof(int16_t), MALLOC_CAP_SPIRAM);
    #else
        inference.buffers[1] = (int16_t *)malloc(n_samples * sizeof(int16_t));
    #endif

    if (inference.buffers[1] == NULL)
    {
        // Clean up the first buffer with the matching deallocator.
        #ifdef EI_BUFFER_IN_PSRAM
            heap_caps_free(inference.buffers[0]);
        #else
            free(inference.buffers[0]);
        #endif

        return false;
    }

    inference.buf_select = 0;