I came across this preprint today: https://arxiv.org/abs/2209.11883, which develops a biologically plausible learning algorithm for DNNs called "multilayer SoftHebb". Below is their core formula:

[formula image not reproduced here; see Sec 3 in https://arxiv.org/abs/2209.11883]

This looked very familiar, so I checked the Wikipedia article on Oja's rule, which shows:
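Since that screenshot also didn't survive, here is Oja's rule in its standard form (writing it from memory, in LaTeX):

\Delta w_i = \eta \, y \, (x_i - y \, w_i), \qquad y = \sum_i w_i x_i

i.e. a Hebbian term y x_i plus a "forgetting" term -y^2 w_i that keeps the weight norm bounded.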

Yes, I know that SoftHebb adds a softmax activation function on top of this.
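To make the comparison concrete, here is a small NumPy sketch of both updates. The SoftHebb form is my paraphrase of Sec 3, not copied from the paper: the symbols u and y and the exact placement of the softmax are my assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 0.02
x = rng.normal(size=4)           # one input vector
W = rng.normal(size=(3, 4))      # a layer of 3 neurons x 4 inputs

def oja_update(w, x, eta):
    """Oja's rule for a single neuron: dw = eta * y * (x - y * w)."""
    y = w @ x
    return w + eta * y * (x - y * w)

def softhebb_update(W, x, eta):
    """SoftHebb-style layer update (my paraphrase of Sec 3):
    pre-activation u = W x, soft winner-take-all y = softmax(u),
    then an Oja-like step dw_k = eta * y_k * (x - u_k * w_k)."""
    u = W @ x
    y = np.exp(u - u.max())
    y = y / y.sum()              # softmax couples neurons across the layer
    return W + eta * y[:, None] * (x[None, :] - u[:, None] * W)

# Oja's forgetting term drives the weight norm toward 1 on repeated input:
w = W[0].copy()
for _ in range(500):
    w = oja_update(w, x, eta)
print(np.linalg.norm(w))         # approaches 1.0 after convergence

W_new = softhebb_update(W, x, eta)
print(W_new.shape)
```

Under this reading, the two updates do match term for term; the differences are that SoftHebb's postsynaptic factor outside the bracket is the softmax output y_k while the forgetting term keeps the linear pre-activation u_k, and the softmax makes neurons within a layer compete.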