"Building AI-Powered Analytics: From R Fundamentals to Local LLMs and Interactive Dashboards"
Workshop
Master the complete analytics workflow from data exploration to AI-powered interactive applications in this comprehensive hands-on workshop. Learn R programming fundamentals, exploratory data analysis with modern libraries (tidyverse, janitor, funModeling), and when to blend in Python using reticulate. Build professional Shiny dashboards with bslib, then extend them with AI capabilities using local models via Ollama and the ellmer (formerly elmer) and shinychat packages. Discover how to run LLMs locally for data analysis, create AI-enhanced dashboards, and build intelligent chatbots within your Shiny apps—all without cloud dependencies or API costs.
This workshop is for analysts, data scientists, and developers ready to integrate AI into their analytics stack while maintaining control over their own code and data infrastructure. Participants will leave with working code examples and practical strategies for deploying AI-powered analytics applications. Prerequisites: Basic programming experience helpful but not required; laptops with R and RStudio/Positron IDE installed.
Detailed 6-Hour Workshop Structure (tentative schedule; modules and library choices may be refined before the conference)
Module 1: R Fundamentals & Environment Setup (60 minutes)
Topics:
Workshop overview and learning objectives
R and RStudio/Positron IDE setup verification
R basics: data types, vectors, data frames, lists
Installing and loading packages
Working directory and project organization
Hands-On Activities:
Create first R project
Install essential packages: tidyverse, here, janitor, funModeling (libraries may change before final schedule is established)
Load sample dataset (retail/marketing data)
Basic data manipulation: filtering, selecting, mutating
Deliverable: Working R project with organized file structure
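The basic manipulation steps above can be sketched as follows. This is a minimal illustration using the built-in mtcars dataset as a stand-in for the retail/marketing sample data; the actual workshop dataset and column names will differ.

```r
# Filter, select, and mutate with dplyr -- the three verbs covered in Module 1.
library(dplyr)

result <- mtcars |>
  filter(mpg > 20) |>         # keep only fuel-efficient cars
  select(mpg, cyl, wt) |>     # keep only the columns of interest
  mutate(wt_kg = wt * 453.6)  # engineer a new column (1000-lb units -> kg)

head(result)
```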
Module 2: Exploratory Data Analysis (EDA) with R (75 minutes)
Topics:
EDA philosophy and workflow
Data quality assessment with janitor package
Automated EDA with funModeling::df_status()
Identifying outliers, missing data, and distributions
Data visualization with ggplot2
Statistical summaries and correlations
Feature engineering basics
R Libraries Covered: (may change before final deck is complete)
janitor - data cleaning and tabulation
funModeling::df_status() - quick data profiling
ggplot2 - visualization
dplyr - data transformation
skimr - summary statistics
Hands-On Activities:
Clean messy dataset using janitor (libraries may change before final deck is complete)
Run df_status() to identify data quality issues
Create exploratory visualizations
Engineer new features from existing variables
Document findings in R Markdown
Deliverable: Complete EDA report identifying data patterns and quality issues
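A compressed sketch of the EDA workflow above: clean column names with janitor, profile the data with funModeling::df_status(), and summarize with skimr. The tiny inline data frame is an illustrative stand-in for the workshop's messy dataset.

```r
# Clean -> profile -> summarize, matching the Module 2 activity sequence.
library(janitor)
library(funModeling)
library(skimr)

raw <- data.frame(
  `Customer ID` = 1:5,
  `Total Spend` = c(10, NA, 30, 40, 50),
  check.names = FALSE
)

clean <- clean_names(raw)  # -> customer_id, total_spend

df_status(clean)           # per-column NA / zero / type profile
skim(clean)                # broader summary statistics
```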
BREAK: 15 minutes
Module 3: Blending R and Python for Analytics (60 minutes)
Topics:
The polyglot analytics stack philosophy
When to use R vs. Python (and why)
Introduction to reticulate package
Calling Python from R
Data interchange between languages
Use case: Python for data engineering, R for analysis
Math/Code/Domain separation architecture
Python Integration:
Setting up Python environment for R
Using pandas in R via reticulate
Passing data frames between R and Python
Leveraging Python ML libraries (scikit-learn) from R
Hands-On Activities:
Install and configure reticulate
Import Python libraries
Load data with Python pandas, analyze in R
Create hybrid workflow: Python preprocessing → R visualization
Practice calling Python functions from R code
Deliverable: Hybrid R/Python analysis script demonstrating language interoperability
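The hybrid workflow above can be sketched as follows. This assumes a Python environment with pandas is available and discoverable by reticulate; the data is a toy stand-in.

```r
# Load data with pandas via reticulate, then analyze it in R.
library(reticulate)

pd <- import("pandas")

# Build a pandas DataFrame on the Python side (reticulate converts the
# named R list to a Python dict automatically).
pydf <- pd$DataFrame(list(x = c(1, 2, 3), y = c(10, 20, 30)))

# Convert the pandas DataFrame to an R data.frame and summarize in R.
rdf <- py_to_r(pydf)
summary(rdf$y)
```

In a real pipeline the pandas step would do the heavy preprocessing (joins, reshaping, feature extraction) before handing a tidy frame back to R for visualization.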
LUNCH BREAK: 60 minutes
Module 4: Building Interactive Dashboards with Shiny (75 minutes)
Topics:
Shiny architecture: UI and Server
Building reactive applications
Layout with bslib (modern dashboard design)
Interactive widgets: inputs and outputs
Reactive programming concepts
Shiny modules for reusable components
Performance optimization and caching with memoise
Deployment options (shinyapps.io, Posit Connect, Shinylive)
Shiny Features Covered (may change before final deck is complete)
bslib - modern UI components, tooltips, popovers
shinyjs - JavaScript integration
plotly/ggplotly - interactive visualizations
Reactive expressions and observers
Caching with renderCachedPlot() and memoise
Hands-On Activities:
Create basic Shiny app with input controls
Build dashboard with bslib cards and layouts
Add interactive plots with plotly
Implement caching for expensive calculations
Create reusable Shiny module
Test app locally
Deliverable: Functional interactive dashboard displaying analytics results
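A minimal skeleton of the first two Module 4 activities: an input control in a bslib sidebar layout and a reactive plot inside a card. The plot and variable names are placeholders.

```r
# Minimal Shiny app with a bslib page layout, one input, and one output.
library(shiny)
library(bslib)

ui <- page_sidebar(
  title = "Analytics Dashboard",
  sidebar = sidebar(
    sliderInput("n", "Sample size", min = 10, max = 500, value = 100)
  ),
  card(
    card_header("Distribution"),
    plotOutput("hist")
  )
)

server <- function(input, output, session) {
  # Re-renders automatically whenever input$n changes (reactivity).
  output$hist <- renderPlot({
    hist(rnorm(input$n), main = NULL, col = "steelblue")
  })
}

# shinyApp(ui, server)  # uncomment to run locally
```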
BREAK: 15 minutes
Module 5: Local LLMs and AI Integration (60 minutes)
Topics:
Introduction to local LLMs (Ollama)
Why use local models vs. cloud APIs
Setting up Ollama on your machine
Available models: llama3, mistral, phi, etc.
R packages for LLM integration: ellmer (formerly elmer), rollama
Prompt engineering for data analysis
Using LLMs for:
Data exploration and pattern identification
Code generation and debugging
Automated EDA narration
Feature engineering suggestions
Technical Setup:
Installing Ollama
Pulling local models
Installing the ellmer package
Configuring model connections
Hands-On Activities:
Install Ollama and download models (llama3, mistral)
Connect R to local LLM using ellmer
Send prompts from R scripts
Use LLM to analyze dataset
Generate code for visualizations using AI
Debug R code with AI assistance
Create automated data summary using LLM
Deliverable: R script that uses local LLM for automated data analysis and insights
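Connecting R to a local model boils down to a few lines with ellmer. This sketch assumes Ollama is running locally and the llama3 model has already been pulled; the prompt is an illustrative example.

```r
# Send a data-analysis prompt to a local Ollama model via ellmer.
library(ellmer)

chat <- chat_ollama(model = "llama3")

chat$chat(
  "A dataset has 12% missing values in the income column and age values
   above 120. Suggest data-quality checks and cleaning steps in R."
)
```

The same `chat` object can be reused across prompts, so an automated-EDA script can loop over columns and ask the model to narrate each one.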
Module 6: AI-Powered Shiny Apps with shinychat (60 minutes)
Topics:
Introduction to shinychat package
Building chatbots in Shiny apps
Integrating local LLMs into dashboards
Chat interface design patterns
Context management for conversations
Use cases:
Data exploration assistant
Natural language query interface
Automated report generation
Interactive data Q&A
Architecture:
Shiny UI + Chat interface
Server logic connecting to local LLM
Streaming responses
Conversation history management
Tool calling from chat (e.g., "plot this data")
Advanced Topics:
Retrieval Augmented Generation (RAG) concepts
Embedding your data documentation for LLM context
Agentic workflows: LLM decides which analysis to run
Hands-On Activities:
Install and configure shinychat
Create basic chat interface in Shiny
Connect chat to Ollama local model
Build data exploration chatbot
Enable LLM to query data and generate plots
Implement conversation memory
Deploy complete AI-powered dashboard
Deliverable: Working Shiny app with integrated AI chatbot for data analysis
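The architecture above, in miniature: a shinychat UI wired to a local Ollama model through ellmer, with streaming responses. This is a sketch assuming Ollama is running with llama3 available; it omits conversation-memory and tool-calling details covered in the module.

```r
# Minimal AI chatbot in Shiny: chat_ui() for the interface,
# ellmer for the local LLM, chat_append() to stream the reply.
library(shiny)
library(shinychat)
library(ellmer)

ui <- bslib::page_fluid(
  chat_ui("chat")
)

server <- function(input, output, session) {
  llm <- chat_ollama(model = "llama3")

  # shinychat exposes the user's message as input$<id>_user_input.
  observeEvent(input$chat_user_input, {
    stream <- llm$stream_async(input$chat_user_input)
    chat_append("chat", stream)  # stream tokens into the chat window
  })
}

# shinyApp(ui, server)  # uncomment to run locally
```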
Module 7: Integration, Deployment & Best Practices (45 minutes)
Topics:
Bringing it all together: Complete workflow
Project organization and documentation
Version control with Git
R package development basics (creating your own analytics packages)
Deployment strategies:
shinyapps.io (cloud)
Posit Connect (enterprise)
Shinylive (browser-based, no server)
Docker containers
Performance optimization
Production readiness checklist
Golem framework for production Shiny apps
Best Practices:
Readable code and documentation
Error handling and logging
Testing Shiny apps
Monitoring and maintenance
Security considerations with local LLMs
Hands-On Activities:
Organize complete project structure
Create GitHub repository
Document code and analysis
Prepare app for deployment
Test on Shinylive (free browser-based hosting option)
Deliverable: Production-ready, documented analytics project with deployment plan
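The Shinylive test step above is a short export-and-preview cycle. This sketch assumes the app lives in a local ./app directory and that the shinylive and httpuv packages are installed; note that LLM-backed features need a running Ollama server, which a purely static Shinylive export cannot provide.

```r
# Export a Shiny app to a static site that runs entirely in the browser.
library(shinylive)

shinylive::export(appdir = "app", destdir = "site")

# Serve the exported static site locally to test it.
httpuv::runStaticServer("site")
```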
Wrap-Up & Q&A (15 minutes)
Topics:
Workshop recap and key takeaways
Resources for continued learning
Community resources (R-Ladies, Posit Community)
Advanced topics to explore
Open Q&A
Resources Provided:
Complete GitHub repository with all workshop code
Curated list of R packages and tutorials
Ollama model recommendations
Deployment guides
Sample datasets for practice
Slides and reference materials
Technical Requirements
Participants Should Have Installed:
R (version 4.3+)
RStudio or Positron IDE (latest version)
Ollama (with at least one model downloaded, e.g., llama3)
Git (for version control)
Python (3.9+) with pandas (for Module 3)
Laptop with 8GB+ RAM (16GB recommended for local LLMs)
10GB free disk space (for Ollama models)
Stable internet connection
Learning Outcomes
By the end of this workshop, participants will be able to:
Conduct thorough exploratory data analysis using modern R libraries
Integrate R and Python in a single analytics workflow using reticulate
Build interactive dashboards with Shiny and modern UI components
Run local LLMs for data analysis without cloud dependencies
Create AI-powered applications that combine analytics with conversational interfaces
Deploy analytics projects
Follow best practices for production-ready analytics code