I have a SQL dump file containing an entire MariaDB database that I need to query. The dump is several GB in size.
The problem is that I don't have access to a local DB installation and won't get one due to my company's IT security restrictions.
Can I iterate over my dump and execute its statements via Python's sqlite3 module? I couldn't find a proper explanation of how to do so.
I've used this code snippet to iterate over my dump and at least print all table names, to get an overview of the DB:
```python
import re

table_list = []
with open('dmp.file', encoding='cp437') as f:
    for line in f:
        line = line.strip()
        if line.lower().startswith('create table'):
            table_name = re.findall(r'create table `([\w_]+)`', line.lower())
            table_list.extend(table_name)

for x in table_list:
    print(x)
```

This worked fine; however, in my dump the statements for creating tables and so on span multiple lines, so this approach no longer works neatly. I wrote the following to join each statement onto a single line:
```python
currentLine = ""
with open(File, encoding='cp437') as f:
    for line in f:
        line = line.strip()
        currentLine = currentLine + " " + line
        if line.endswith(';'):
            with open('NewFileOneLiner.txt', "a", encoding="utf-8") as g:
                g.write(currentLine.lstrip() + '\n')
            currentLine = ""
```

I'm wondering what additional steps are needed. Since both are SQL databases, transforming the SQL statements should theoretically be possible. Is there any way to execute all the statements in SQLite? Where are the boundaries and caveats of this approach? Does SQLite not support some key concepts of SQL that I need to be aware of in this case?
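For context, this is roughly what I had in mind for the execution step. It's only a sketch: the sample statements stand in for lines of my `NewFileOneLiner.txt`, the regex that strips the MariaDB-only `ENGINE=...` table options is my own guess, and statements SQLite rejects are simply skipped. Whether skipping/rewriting like this is viable is exactly what I'm unsure about:

```python
import re
import sqlite3

# Sample one-lined statements standing in for lines of NewFileOneLiner.txt
statements = [
    "CREATE TABLE `users` (`id` int NOT NULL, `name` varchar(50)) ENGINE=InnoDB DEFAULT CHARSET=utf8;",
    "INSERT INTO `users` VALUES (1, 'alice');",
    "LOCK TABLES `users` WRITE;",  # MariaDB-only; SQLite will raise an error here
]

con = sqlite3.connect(':memory:')  # or a file path to persist the result
cur = con.cursor()

for stmt in statements:
    # Strip MariaDB-specific table options that SQLite doesn't understand
    stmt = re.sub(r'\)\s*ENGINE=[^;]*;', ');', stmt, flags=re.IGNORECASE)
    try:
        cur.execute(stmt)
    except sqlite3.Error as e:
        print(f'skipped: {e}')

con.commit()
print(cur.execute("SELECT * FROM users").fetchall())  # → [(1, 'alice')]
```

With the real dump I would read the statements from the file instead of a list, but the try/except-and-skip pattern would be the same.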