Code works in Jupyter but won't run in Anaconda

I have a problem with some code I have written. I spent a long time getting a very long script working in a Jupyter notebook and testing it to make sure it is fully working.

When I was happy with it, I downloaded it as a .py file so I could run it from an Anaconda terminal. There is a function in the code that works fine in Jupyter but fails in Anaconda. Can anybody explain why this is, or how I can rewrite the code to work as expected?

The function is applied to a pandas dataframe row by row to populate a new column. I defined two global variables that are referenced within the function and updated on each iteration. They need to be global; otherwise the function would overwrite them incorrectly on every row.

The function looks at the current row, and if the value in the specified column matches the first variable, it populates the new column with the sum of the second variable and two other columns in the dataframe. It then updates the second variable to this new value so it can be referenced on the next row.

If the value doesn't match the first variable, it does the same sum but ignores the second variable on that iteration. It then sets both variables from this row, ready for the next iteration.

Here is the code I used:

# Function that works out the post-transaction stock balance
import numpy

part = numpy.int64(0)
bal = numpy.float64(0)

def balance_calculation(vec):
    global part
    global bal
    if vec[0] == part:
        # Same part as the previous row: add this row's variance to the running balance
        variance = vec[3] - vec[4]
        bal += variance
        part = vec[0]
        return bal
    else:
        # Different part: restart the balance from this row's variance
        bal = vec[3] - vec[4]
        part = vec[0]
        return bal
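To make the behaviour concrete, here is a minimal, self-contained sketch of the same pattern applied with `df.apply(axis=1)`. The dataframe below is made-up test data, not my real file; its column layout (part number in column 0, the two quantity columns in positions 3 and 4) just mirrors the indexing used in the function.

```python
import numpy
import pandas as pd

part = numpy.int64(0)
bal = numpy.float64(0)

def balance_calculation(vec):
    global part
    global bal
    if vec[0] == part:
        # Same part as the previous row: add this row's variance to the running balance
        bal += vec[3] - vec[4]
        part = vec[0]
        return bal
    else:
        # Different part: restart the balance from this row's variance
        bal = vec[3] - vec[4]
        part = vec[0]
        return bal

# Made-up test data: column 0 is the part number,
# columns 3 and 4 are the quantities being summed
df = pd.DataFrame([
    [1, "a", "x", 10.0, 2.0],
    [1, "a", "x",  5.0, 1.0],
    [2, "b", "x",  7.0, 4.0],
    [2, "b", "x",  3.0, 1.0],
])

# Apply row by row; each vec is a row Series indexed by column position
df["balance"] = df.apply(balance_calculation, axis=1)
print(df["balance"].tolist())  # → [8.0, 12.0, 3.0, 5.0]
```

Rows 1 and 2 share part 1, so the second balance is the running total (8 + 4 = 12); row 3 switches to part 2, so the balance restarts at 3.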

As mentioned, this works fine in Jupyter, but in Anaconda I am getting this error:

NameError: name 'part' is not defined

Is this all of your code?
When you ran the code in Jupyter, did you also create a dataframe from the CSV file (after importing it with df = pd.read_csv())? Did you import the CSV into the Anaconda environment?