user32882

Reputation: 5877

Substituting matrix for a scalar in SymPy

Here's what I'm doing in a SymPy session:

from sympy import *
xi1, xi2, xi3 = symbols('xi_1, xi_2, xi_3')
N1 = 1 - xi1 - xi2 - xi3
N2 = xi3
N3 = xi1
N4 = xi2
x1, x2, x3, x4 = symbols('x_1, x_2, x_3, x_4')
x = N1*x1 + N2*x2 + N3*x3 + N4*x4
subdict = {x1: Matrix([0.025, 1.0, 0.0]), x2: Matrix([0, 1, 0]), x3: Matrix([0, 0.975, 0]), x4: Matrix([0, 0.975, 0.025])}
test = x.subs(subdict)
test.subs({xi1: 1, xi2: 0, xi3: 0})

To me this is simply multiplying some scalars by some vectors and then adding them up. SymPy disagrees, however, and throws an enormous error, the last line of which is:

TypeError: cannot add <class 'sympy.matrices.immutable.ImmutableDenseMatrix'> and <class 'sympy.core.numbers.Zero'>

Why is this a problem? Is there a workaround for what I am trying to do?

Upvotes: 1

Views: 927

Answers (1)

smichr

Reputation: 19057

I suspect that the scalar substitutions are applied before the matrix ones, so 0*matrix_symbol evaluates to the scalar 0 instead of a matrix of zeros. Terms that do end up as matrices cannot then be added to that scalar 0, hence the error. My attempts at using the simultaneous flag or xreplace instead of subs give the same result (on sympy.live.org). I then tried doing the substitutions in reverse order, passing them as a list with the matrices first; that still didn't work. It looks like subs assumes that 0*foo is 0. If there is not already an existing issue for this, one should be raised on the SymPy issue tracker.
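Here is a minimal sketch of the failure mode I have in mind (the symbols c and v are my own illustration, not names from the question):

from sympy import Matrix, S, symbols

c, v = symbols('c v')

# A zero substituted into a product collapses it to the scalar Zero,
# even if the other factor was meant to become a matrix later:
print((c*v).subs(c, 0))  # 0

# Adding that scalar Zero to an actual matrix raises the reported error:
Matrix([1, 2]) + S.Zero  # TypeError: cannot add ... and ... Zero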

The workaround is to do the scalar substitutions first, allowing the zero terms to disappear, and only then do a subs with the matrices. This requires two calls to subs, as sketched below.
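With the question's own definitions that might look like the following (at_node is just an illustrative name):

# Scalar substitutions first: zero coefficients vanish while all terms
# are still scalar, so no matrix-plus-Zero addition is ever attempted.
at_node = x.subs({xi1: 1, xi2: 0, xi3: 0})  # -> x_3

# Only then substitute the matrices into what is left.
at_node.subs(subdict)  # -> Matrix([[0], [0.975], [0]])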

A hackish workaround that handles the substitution with 0 directly, in a single pass, is this:

from sympy import symbols, Mul, randMatrix

x, y = symbols('x y')

def remul(m):
    # Rebuild the product one factor at a time; multiplying through
    # Python's * lets a matrix factor keep the result a matrix.
    rv = 1
    for i in m.args:
        rv *= i
    return rv

expr = x*y
mat = expr.subs(x, randMatrix(2))  # replace x with a matrix
expr = mat.replace(  # replace y with scalar 0
    lambda m: m.is_Mul,
    lambda m: remul(Mul(*[i.subs(y, 0) for i in m.args], evaluate=False)))
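The evaluate=False keeps Mul from flattening the zero out of the product on construction; remul then multiplies the factors back together one at a time, so any matrix among them is handled by the matrix's own __mul__/__rmul__, and a zero coefficient produces a zero matrix instead of the scalar Zero.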

Upvotes: 2
