Takes an "eternity" to run my Python script
I have a Python script that loads binary data from any target file and stores it inside itself, in a list. The problem is that the bigger the stored file is, the longer the script takes to open the next time. Say I load a 700 MB movie and store it in the script file, then open that script the next day with the 700 MB of data still embedded in it. It takes an eternity to open!
Here is a simplified layout of how the script file looks:
Line 1: the 700 MB movie, stored inside a list.
Everything below: all the functions that the end-user uses.
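To make that concrete, here is a minimal sketch of what such a script might look like. The file name, the variable MOVIE_DATA, and the functions save_movie and movie_size are hypothetical placeholders; in the real script the list literal on the first line is roughly 700 MB of source text.

```python
# movie_container.py -- hypothetical sketch of the layout described above.
# In the real script, the list literal below is ~700 MB of source text.

# Line 1: the binary data, embedded as one enormous list literal of byte values.
MOVIE_DATA = [0x00, 0x47, 0x40, 0x11, 0x10]  # imagine hundreds of millions of entries

# Everything below: the functions that the end-user actually calls.
def save_movie(path):
    """Write the embedded data back out to a regular file."""
    with open(path, "wb") as f:
        f.write(bytes(MOVIE_DATA))

def movie_size():
    """Return the size of the embedded data in bytes."""
    return len(MOVIE_DATA)
```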
Before the interpreter reaches the functions the user is waiting to call, it has to parse the 700 MB of data on line 1 first! That is of course a problem, because who wants to wait an hour just to open a script?
So, would it help if I changed the layout of the file like this?
First lines: all the functions that the end-user uses.
Below: the 700 MB movie, stored inside a list.
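In other words, the same hypothetical script reordered so the data literal comes last:

```python
# movie_container.py -- the proposed reordering (same hypothetical names as above).

# First lines: the functions that the end-user uses.
def save_movie(path):
    """Write the embedded data back out to a regular file."""
    with open(path, "wb") as f:
        f.write(bytes(MOVIE_DATA))

def movie_size():
    """Return the size of the embedded data in bytes."""
    return len(MOVIE_DATA)

# Below: the 700 MB movie, still embedded as one enormous list literal.
MOVIE_DATA = [0x00, 0x47, 0x40, 0x11, 0x10]  # imagine hundreds of millions of entries
```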
Would that help? Or would the interpreter have to plow through all 700 MB before the functions could be called anyway?