This post has been updated from its original 2012 version to use PostgreSQL and the Python psycopg2 library. For more of my writing on SQL, check out my book Practical SQL from No Starch Press.

Let’s say you want to generate a few hundred - or even a thousand - flat JSON files from a SQL database. Maybe you want to power a data visualization but have neither the time nor the desire to spin up a server to dynamically generate the data. So, you want flat files, each one small for quick loading.

I’ve gone this route for a few data-driven visuals, creating JSON files out of large database tables. Python works well for this, with its JSON encoder/decoder offering a flexible set of tools for converting Python objects to JSON.

Here’s a script that creates a table and fills it with two rows of data:

```sql
CREATE TABLE students_test (
```

Here’s an example Python script that generates two JSON files from that query. One file contains JSON row arrays, and the other has JSON key-value objects. Below, we’ll walk through it step-by-step.

After our import statements, we set a connection string to the server:

```python
conn_string = "host='localhost' dbname='test' user='me' password='pw'"
```

Then, we use the psycopg2 library to open that connection and execute a SELECT * query:

```python
cursor.execute("SELECT * FROM students_test")
```

The script loads the query results into a list object called rows, which we can iterate through to do any number of things. Here, we write them out to the two files:

```python
with open("student_rowarrays.js", "w") as f:
```

```python
with open("student_objects.js", "w") as f:
    # Convert query to objects of key-value pairs
```
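The CREATE TABLE statement above is cut off after the opening parenthesis. A minimal completion might look like the following; the column definitions and the two sample rows are my assumptions, not the original post’s:

```sql
-- Hypothetical completion: column names and sample rows are assumed.
CREATE TABLE students_test (
    id serial PRIMARY KEY,
    firstname varchar(50),
    lastname varchar(50)
);

INSERT INTO students_test (firstname, lastname)
VALUES ('Samantha', 'Baker'),
       ('Mark', 'Salomon');
```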
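Putting the fragments above together, here is a minimal sketch of the whole script. Because the full listing didn’t survive, the psycopg2 connection steps are shown as comments and the rows list is hard-coded so the sketch runs without a database; the column names and sample values are my assumptions.

```python
import json

# Stand-in for the query results. In the real script these come from psycopg2:
#   conn = psycopg2.connect(conn_string)
#   cursor = conn.cursor()
#   cursor.execute("SELECT * FROM students_test")
#   rows = cursor.fetchall()
# Hard-coded here (assumed sample data) so the sketch runs without a server.
rows = [
    (1, "Samantha", "Baker"),
    (2, "Mark", "Salomon"),
]

# Assumed column names; the original CREATE TABLE statement is truncated.
columns = ["id", "firstname", "lastname"]

# File 1: JSON row arrays, e.g. [[1, "Samantha", "Baker"], ...]
with open("student_rowarrays.js", "w") as f:
    f.write(json.dumps([list(row) for row in rows], indent=2))

# File 2: JSON key-value objects, e.g. [{"id": 1, "firstname": "Samantha", ...}]
with open("student_objects.js", "w") as f:
    objects = [dict(zip(columns, row)) for row in rows]
    f.write(json.dumps(objects, indent=2))
```

The row-array format is more compact to serve; the key-value format is self-describing, which is handy when a visualization library expects named fields.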