Active questions tagged python - Stack Overflow

Leverage broadcasting to make this subtraction more efficient

I have an array x of shape (N, T, d). I have two functions f and g which both take an array of shape (some_dimension, d) and return an array of shape (some_dimension, ).

I would like to compute f on all of x. This is simple: f(x.reshape(-1, d)).

I would then like to compute g only on the first slice of the second dimension, meaning g(x[:, 0, :]), and subtract it from the evaluation of f across all dimensions. This is exemplified in the code below.

MWE - Inefficient Way

import numpy as np

# Reproducibility
seed = 1234
rng = np.random.default_rng(seed=seed)

# Generate x
N = 100
T = 10
d = 2
x = rng.normal(loc=0.0, scale=1.0, size=(N, T, d))

# In practice the functions are not this simple
def f(x):
    return x[:, 0] + x[:, 1]

def g(x):
    return x[:, 0]**2 - x[:, 1]**2

# Compute f on all the (flattened) array
fx = f(x.reshape(-1, d)).reshape(N, T)

# Compute g only on the first slice of the second dimension. Here are two ways of doing so
gx = np.tile(g(x[:, 0])[:, None], reps=(1, T))
gx = np.repeat(g(x[:, 0]), axis=0, repeats=T).reshape(N, T)

# Finally compute what I really want to compute
diff = fx - gx

Is there a more efficient way? I feel there must be one using broadcasting, but I cannot figure it out.
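For what it's worth, one broadcasting-based sketch (reusing the toy f, g, and array shapes from the MWE above): since g(x[:, 0]) has shape (N,), appending a trailing axis gives shape (N, 1), which NumPy broadcasts against fx's (N, T) during the subtraction, so no tiled or repeated copy of gx is ever materialized.

```python
import numpy as np

rng = np.random.default_rng(seed=1234)
N, T, d = 100, 10, 2
x = rng.normal(loc=0.0, scale=1.0, size=(N, T, d))

# Same toy functions as in the MWE
def f(x):
    return x[:, 0] + x[:, 1]

def g(x):
    return x[:, 0]**2 - x[:, 1]**2

fx = f(x.reshape(-1, d)).reshape(N, T)

# (N,) -> (N, 1): broadcasts against fx's (N, T) in the subtraction,
# so the (N, T) copy built by np.tile / np.repeat is never allocated
diff = fx - g(x[:, 0])[:, None]
```

The result matches the tile/repeat versions element for element; the only change is that the expansion along the T axis happens lazily inside the subtraction rather than in an explicit intermediate array.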
