I am trying to write a function whose argument is a SymPy symbol with a default value, but I am not getting the result I expect.
Here is my code
import sympy as sy

def addtwo(y = sy.symbols('y', reals = True)):
    return y + 2

y = sy.symbols('y')
x = addtwo()
print(type(x))
print(x)
print(x.subs(y, 1))
with output
<class 'sympy.core.add.Add'>
y + 2
y + 2
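My guess (not confirmed anywhere in the docs, just a check I ran) is that the default argument creates a different symbol from the one I define at module level, since SymPy symbols compare by name and assumptions together:

import sympy as sy
from sympy import srepr

y_default = sy.symbols('y', reals = True)  # same call as the default argument
y_module = sy.symbols('y')                 # the y I later pass to subs

print(srepr(y_default))        # shows the extra assumption on the default symbol
print(srepr(y_module))
print(y_default == y_module)   # False: symbols match by name *and* assumptions

(I also notice I probably meant real = True rather than reals = True, but the two symbols come out unequal either way.)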
However, the code
y = sy.symbols('y')
x = addtwo(y)
print(x)
print(x.subs(y, 1))
gives the expected output
y + 2
3
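For comparison, here is a minimal sketch from my own experimenting (not a fix I have confirmed is idiomatic) where the default refers to the very same Symbol object; here the substitution works:

import sympy as sy

y = sy.symbols('y')   # single shared symbol, no extra assumptions

def addtwo(arg = y):  # default is the same Symbol object as y
    return arg + 2

x = addtwo()
print(x)              # y + 2
print(x.subs(y, 1))   # 3

Why doesn't the original version, with the assumption in the default argument, behave the same way?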