Export A DataFrame Into MSSQL Server As A New Table
I have written code to connect to a SQL Server instance with Python and save a table from a database into a df.
from pptx import Presentation
import pyodbc
import pandas as pd

cnxn = pyodbc
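(For context, a minimal sketch of the read step the question describes, assuming a trusted connection; the server, database, and table names below are placeholders, not taken from the question.)
import pyodbc
import pandas as pd

# placeholder connection details
cnxn = pyodbc.connect(
    r'DRIVER={SQL Server Native Client 11.0};'
    r'SERVER=ServerName;'
    r'DATABASE=DatabaseName;'
    r'Trusted_Connection=yes;'
)

# read the source table into a dataframe
df = pd.read_sql('SELECT * FROM dbo.SourceTable', cnxn)
cnxn.close()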
Solution 1:
You can use df.to_sql() for that. First create a SQLAlchemy engine, e.g.
from sqlalchemy import create_engine

engine = create_engine("mssql+pyodbc://scott:tiger@myhost:port/databasename?driver=SQL+Server+Native+Client+10.0")
See this answer for more details on the connection string for MSSQL.
Then do:
df.to_sql('table_name', con=engine)
This defaults to raising an exception if the table already exists; adjust the if_exists parameter as needed.
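Putting Solution 1 together, a minimal sketch (the host, credentials, and table name are placeholders, and a small throwaway dataframe stands in for the questioner's data):
from sqlalchemy import create_engine
import pandas as pd

# placeholder connection details
engine = create_engine(
    "mssql+pyodbc://scott:tiger@myhost:port/databasename"
    "?driver=SQL+Server+Native+Client+10.0"
)

# small stand-in dataframe
df = pd.DataFrame({'Name': ['Alice', 'Bob'], 'Age': [30, 25]})

# if_exists='fail' (the default) raises if the table is already there;
# 'replace' drops and recreates it, 'append' adds rows to it.
# index=False keeps the dataframe index from becoming an extra column.
df.to_sql('table_name', con=engine, if_exists='fail', index=False)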
Solution 2:
This is how I do it.
# Insert from dataframe to table in SQL Server
import time
import pandas as pd
import pyodbc
# start timer
start_time = time.time()
df = pd.read_csv("C:\\your_path\\CSV1.csv")
conn_str = (
    r'DRIVER={SQL Server Native Client 11.0};'
    r'SERVER=ServerName;'
    r'DATABASE=DatabaseName;'
    r'Trusted_Connection=yes;'
)
cnxn = pyodbc.connect(conn_str)
cursor = cnxn.cursor()
# insert each dataframe row into the target table
for index, row in df.iterrows():
    cursor.execute(
        'INSERT INTO dbo.Table_1 ([Name],[Address],[Age],[Work]) VALUES (?,?,?,?)',
        row['Name'],
        row['Address'],
        row['Age'],
        row['Work']
    )
cnxn.commit()
cursor.close()
cnxn.close()
# see total time to do insert
print("%s seconds ---" % (time.time() - start_time))
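For larger dataframes, a batched variant of the same insert is usually much faster than inserting one row at a time. A sketch (not from the original answer) using pyodbc's executemany with fast_executemany enabled, reusing the same placeholder server, database, CSV path, and dbo.Table_1 columns as above:
import pandas as pd
import pyodbc

df = pd.read_csv("C:\\your_path\\CSV1.csv")

# placeholder connection details
conn_str = (
    r'DRIVER={SQL Server Native Client 11.0};'
    r'SERVER=ServerName;'
    r'DATABASE=DatabaseName;'
    r'Trusted_Connection=yes;'
)
cnxn = pyodbc.connect(conn_str)
cursor = cnxn.cursor()
cursor.fast_executemany = True  # send the parameters in bulk

# build one parameter tuple per dataframe row
params = list(df[['Name', 'Address', 'Age', 'Work']].itertuples(index=False, name=None))

cursor.executemany(
    'INSERT INTO dbo.Table_1 ([Name],[Address],[Age],[Work]) VALUES (?,?,?,?)',
    params
)
cnxn.commit()
cursor.close()
cnxn.close()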