---
layout: post
title: SQL Server and Python Machine Learning
date: 2020-05-19
description: 2020-05-19-SQL Server and Python Machine Learning
img: py.png # Add image post (optional)
tags: [Machine Learning, SQL Server, Python, Pandas, SciKitLearn]
datatable: true
author: Ian Fogelman # Add name author (optional)
---
Today we will be utilizing a feature available in SQL Server 2017+ environments: machine learning via native Python script execution. In this tutorial we will recap how to load a pickle file into a SQL Server database stored as binary data, retrieve that file, load it as a machine learning model, and apply the model's predictions to our SQL Server data. The steps to achieve our predictions are as follows:
I. Create a table to hold our binary models
II. Create a table to hold our machine learning data
III. Create a procedure to load our pickled model into the ML table
IV. Execute the stored procedure to load the pickle file
V. Create another stored procedure to take in our SQL Server data and model and return a result set
VI. Compare the result set from the model to our data
I. The first step is to create a database and a table to store our machine learning models as binary data.
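A minimal sketch of this step; the database name `IrisDB` and table name `dbo.Models` are my assumptions, not fixed by the post:

```sql
-- Assumed names: database IrisDB, table dbo.Models
CREATE DATABASE IrisDB;
GO

USE IrisDB;
GO

-- The model itself is stored as a binary blob alongside a friendly name,
-- so multiple model versions can live side by side.
CREATE TABLE dbo.Models
(
    ModelId   INT IDENTITY(1,1) PRIMARY KEY,
    ModelName VARCHAR(100)   NOT NULL,
    Model     VARBINARY(MAX) NOT NULL
);
```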
II. Next we need to load the Iris data from our CSV file into a table in our new database. Let's create the table in T-SQL and utilize a BULK INSERT command.
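Something along these lines; the column names, file path, and CSV layout (header row, comma-delimited) are assumptions to adjust for your environment:

```sql
-- Table shape assumes the classic Iris CSV: four measurements plus a label
CREATE TABLE dbo.Iris
(
    SepalLength FLOAT,
    SepalWidth  FLOAT,
    PetalLength FLOAT,
    PetalWidth  FLOAT,
    Species     VARCHAR(50)
);
GO

-- Path is illustrative; FIRSTROW = 2 skips the header row
BULK INSERT dbo.Iris
FROM 'C:\Data\iris.csv'
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```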
III. Next we must load our persisted model from a pickle file. Pickle is a Python module that lets you serialize almost any Python object to a file on your workstation. In this case I trained the model beforehand in a Jupyter notebook environment and created the pickle file already.
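The offline training step might look like the sketch below. It assumes scikit-learn is installed; the classifier choice (a decision tree) and the file name `iris_model.pkl` are illustrative, since the post does not specify them:

```python
# Offline training step, done beforehand in a Jupyter-style environment.
# Assumptions: scikit-learn available, decision tree as the model,
# 'iris_model.pkl' as the output file name.
import pickle

from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
model = DecisionTreeClassifier(random_state=0)
model.fit(iris.data, iris.target)

# Persist the fitted model to a pickle file that SQL Server can ingest
with open('iris_model.pkl', 'wb') as f:
    pickle.dump(model, f)

# Round-trip check: the deserialized model behaves like the original
with open('iris_model.pkl', 'rb') as f:
    restored = pickle.load(f)

print(restored.score(iris.data, iris.target))
```

On the SQL side, the pickle file can then be inserted into the models table as a single blob, e.g. `INSERT INTO dbo.Models (ModelName, Model) SELECT 'Iris Model V1', BulkColumn FROM OPENROWSET(BULK 'C:\Data\iris_model.pkl', SINGLE_BLOB) AS b;`, which is the load procedure the step list refers to.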
IV. Next we need a stored procedure that will pick up the binary model stored in our table and apply the model's predictions to our data.
To do this we use a combination of the @input_data_1, @input_data_1_name, and @params parameters of sp_execute_external_script inside our stored procedure.
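A sketch of that scoring procedure, assuming the table and column names introduced above (the procedure name `dbo.PredictIris` is mine):

```sql
CREATE OR ALTER PROCEDURE dbo.PredictIris
    @ModelName VARCHAR(100)
AS
BEGIN
    -- Pull the requested model's binary payload out of the models table
    DECLARE @model VARBINARY(MAX) =
        (SELECT Model FROM dbo.Models WHERE ModelName = @ModelName);

    EXEC sp_execute_external_script
        @language = N'Python',
        @script = N'
import pickle

# Deserialize the model that arrived as a binary parameter
model = pickle.loads(model_bin)

# InputDataSet is the name we assign to the query result below
features = InputDataSet[["SepalLength", "SepalWidth", "PetalLength", "PetalWidth"]]
InputDataSet["PredictedSpecies"] = model.predict(features)
OutputDataSet = InputDataSet
',
        @input_data_1 = N'SELECT SepalLength, SepalWidth, PetalLength, PetalWidth, Species FROM dbo.Iris',
        @input_data_1_name = N'InputDataSet',
        @params = N'@model_bin VARBINARY(MAX)',
        @model_bin = @model
    WITH RESULT SETS ((
        SepalLength FLOAT, SepalWidth FLOAT, PetalLength FLOAT,
        PetalWidth FLOAT, Species VARCHAR(50), PredictedSpecies INT
    ));
END;
```

The `@params` declaration is what lets us pass the binary model into the Python script as a variable, while `@input_data_1` feeds the table rows in as a pandas DataFrame.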
V. Now we create a temp table to hold the results from our new prediction stored procedure, then join the predicted data back to the original data set to see how it performed.
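Capturing the procedure's result set could look like this, with the temp table shape mirroring the RESULT SETS definition (names again assumed):

```sql
CREATE TABLE #Predictions
(
    SepalLength      FLOAT,
    SepalWidth       FLOAT,
    PetalLength      FLOAT,
    PetalWidth       FLOAT,
    Species          VARCHAR(50),
    PredictedSpecies INT
);

-- INSERT ... EXEC captures the procedure's result set into the temp table
INSERT INTO #Predictions
EXEC dbo.PredictIris @ModelName = 'Iris Model V1';
```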
VI. Now compare the predictions against the original labels to assess model performance!
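Because the procedure echoed the original columns alongside the prediction, the comparison can be done directly on the temp table. This sketch assumes the numeric labels map 0 = setosa, 1 = versicolor, 2 = virginica, as in the standard Iris encoding:

```sql
-- Overall accuracy: share of rows where the predicted label matches the actual
SELECT
    AVG(CASE WHEN p.Species = CASE p.PredictedSpecies
                                   WHEN 0 THEN 'setosa'
                                   WHEN 1 THEN 'versicolor'
                                   WHEN 2 THEN 'virginica'
                              END
             THEN 1.0 ELSE 0.0 END) AS Accuracy
FROM #Predictions AS p;
```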
There we have it: our first ML model trained in Python, stored in SQL Server, and executed via a stored procedure. Further interesting applications of this approach would be building an SSRS report to surface the model's predictions to stakeholders, or continuing to refine the models and "checking them in" to the SQL database. We can easily keep a running tally of model performance because all of the models are stored in that one table, e.g. Iris Model V1, Iris Model Winter, Iris Model 2020.
Ian Fogelman
My name is Ian Fogelman. I like to develop data-driven solutions with SQL Server, Python, .NET, and predictive analytics.