Global Modeling with GluonTS DeepAR: Future of Semiconductors in the U.S.

Bernstein has conducted an analysis of the U.S. supply and demand balance in analog and discrete semiconductors, particularly in light of the potential introduction of Section 232 tariffs. The analysis focuses on the implications for major companies, including Texas Instruments, Analog Devices, Infineon Technologies and Renesas.

According to analysts led by David Dai, if the U.S. succeeds in bringing more manufacturing of end applications back onshore, demand for these products could rise even further. For now, however, there is no supply gap large enough to drive a substantial build-out of U.S. analog and discrete manufacturing capacity.

Infineon and Renesas are the most exposed to potential tariffs because of their relatively small production footprints in the U.S., while Texas Instruments and Analog Devices, both U.S.-based, appear well positioned.

The chart below shows that all four stock prices sit below the DeepAR model's point forecast line, suggesting room to rise if the companies adapt to the supply-chain shifts.

Source code:

library(tidyverse)
library(tidymodels)
library(tidyquant)
library(timetk)
library(modeltime)
library(modeltime.gluonts)

#Texas Instruments Incorporated (TXN)
df_txn <- 
  tq_get("TXN") %>% 
  select(date, 'Texas Instruments' = close)

#Analog Devices, Inc. (ADI)
df_adi <- 
  tq_get("ADI") %>% 
  select(date, 'Analog Devices' = close)

#Infineon Technologies AG (IFNNY)
df_ifnny <- 
  tq_get("IFNNY") %>% 
  select(date, Infineon = close)

#Renesas Electronics Corporation (6723.T)
df_renesas <- 
  tq_get("6723.T") %>% 
  select(date, Renesas = close)


#Merging the datasets
df_merged <- 
  df_txn %>% 
  left_join(df_adi) %>% 
  left_join(df_ifnny) %>% 
  left_join(df_renesas) %>% 
  drop_na() %>% 
  filter(date >= last(date) - months(12)) %>% 
  pivot_longer(-date,
               names_to = "id",
               values_to = "value") %>% 
  mutate(id = as_factor(id)) 
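
#Optional sanity check (not in the original post): each ticker should
#contribute roughly one year of daily closing prices to the long-format panel
df_merged %>% count(id)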



#Split Data 
split <- time_series_split(df_merged,
                           assess = "15 days",
                           cumulative = TRUE)



df_train <- training(split)
df_test <- testing(split)


#Fit a GluonTS DeepAR Model (a single global model across all four tickers)
model_fit_deepar <- deep_ar(
  id                    = "id", #column identifying each series
  freq                  = "D",  #daily frequency
  prediction_length     = 11,   #forecast horizon in trading days
  lookback_length       = 22,   #context window fed to the network
  epochs                = 10    #training epochs
) %>%
  set_engine("gluonts_deepar") %>%
  fit(value ~ ., df_train)

#Modeltime Table
model_tbl <- 
  modeltime_table(
    model_fit_deepar
  )


#Calibrate by ID
calib_tbl <- 
  model_tbl %>%
  modeltime_calibrate(
    new_data = df_test, 
    id       = "id"
  )

#Measure Test Accuracy

#Global Accuracy
calib_tbl %>% 
  modeltime_accuracy(acc_by_id = FALSE) %>% 
  table_modeltime_accuracy(.interactive = FALSE)

#Local Accuracy
calib_tbl %>% 
  modeltime_accuracy(acc_by_id = TRUE) %>% 
  table_modeltime_accuracy(.interactive = TRUE)
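
#Optional sketch (not part of the original post): a fitted GluonTS model wraps
#a Python object, so plain saveRDS() is not enough to persist it. The
#modeltime.gluonts helpers save_gluonts_model()/load_gluonts_model() handle
#this; the path below is only illustrative.
model_fit_deepar %>% 
  save_gluonts_model(path = "deepar_model", overwrite = TRUE)

model_reloaded <- load_gluonts_model(path = "deepar_model")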

#Prediction Intervals
calib_tbl %>%
  modeltime_forecast(
    new_data    = df_test,
    actual_data = df_merged %>% filter(date >= as.Date("2025-07-31")),
    conf_by_id  = TRUE
  ) %>%
  group_by(id) %>%
  plot_modeltime_forecast(
    .facet_ncol  = 2,
    .interactive = FALSE,
    .line_size = 1
  )  +
  labs(title = "Global Modeling with Deep Learning Model", 
       subtitle = "<span style = 'color:dimgrey;'>Predictive Intervals</span> of <span style = 'color:red;'>DeepAR Model</span> Model", 
       y = "", x = "") + 
  scale_y_continuous(labels = scales::label_currency()) +
  scale_x_date(labels = scales::label_date("%b %d"),
               date_breaks = "3 days") +
  theme_tq(base_family = "Roboto Slab", base_size = 16) +
  theme(plot.subtitle = ggtext::element_markdown(face = "bold"),
        plot.title = element_text(face = "bold"),
        plot.background = element_rect(fill = "snow"),
        strip.text = element_text(face = "bold", color = "black"),
        strip.background = element_rect(fill =  "azure"),
        axis.text.x = element_text(angle = 45, hjust = 1, vjust = 1),
        legend.position = "none")
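
If we want forecasts that extend beyond the held-out test window, the usual modeltime follow-up is to refit the calibrated model on the full panel and forecast forward. The sketch below assumes the objects created above; the "15 days" horizon mirrors the test split and is only illustrative.

#Refit on the full dataset and forecast past the last observed date
refit_tbl <- 
  calib_tbl %>% 
  modeltime_refit(data = df_merged)

refit_tbl %>%
  modeltime_forecast(
    h           = "15 days",
    actual_data = df_merged,
    conf_by_id  = TRUE
  ) %>%
  group_by(id) %>%
  plot_modeltime_forecast(.facet_ncol  = 2,
                          .interactive = FALSE)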

One response to “Global Modeling with GluonTS DeepAR: Future of Semiconductors in the U.S.”

  1. Foma Ovolevor

    Nice work, Selcuk. To aid loading of the modeltime.gluonts library, the package link (https://github.com/business-science/modeltime.gluonts) and the installation guide below will be helpful:

    #Install GitHub Version

    remotes::install_github("business-science/modeltime.gluonts")

    #GluonTS Installation – Run 1st time

    install_gluonts(fresh_install = FALSE, include_pytorch = FALSE)

    #2nd run

    library(modeltime.gluonts)
    install_gluonts(fresh_install = TRUE)
