
[BUG] Markdown rendering problem #790


Open
1 task
atigun64 opened this issue Apr 12, 2025 · 2 comments
Labels
type: bug Something isn't working

Comments

@atigun64

atigun64 commented Apr 12, 2025

Environment

https://v2-dev.big-agi.com/

Description

[Screenshot attached in the original issue showing the rendering problem.]

Device and browser

- Device: Asus Zephyrus G14
- OS: Windows 11
- Browser: Firefox

Screenshots and more

No response

Willingness to Contribute

  • 🙋‍♂️ Yes, I would like to contribute a fix.
atigun64 added the type: bug label on Apr 12, 2025
@enricoros
Owner

Thanks @atigun64, can you also paste the answer so I can see what's unexpected in the code?

@atigun64
Author

atigun64 commented Apr 12, 2025

> Thanks @atigun64, can you also paste the answer so I can see what's unexpected in the code?

# Converting Powers of a Matrix Between Bases

You're absolutely correct, and you've touched on one of the most powerful applications of change of basis in linear algebra!

## Diagonalization and Matrix Powers

When $S$ is a matrix whose columns are the eigenvectors of $A$, then $D = S^{-1}AS$ is a diagonal matrix with the eigenvalues of $A$ on the diagonal. This is called the **diagonalization** of $A$.

The key insight is that powers of a diagonal matrix are very easy to compute, since you just raise each diagonal element to the power:

$$D^k = \begin{pmatrix} 
\lambda_1^k & 0 & \cdots & 0 \\
0 & \lambda_2^k & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n^k
\end{pmatrix}$$
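
For a numerical sanity check, here is a minimal NumPy sketch; the matrix `A` below is an arbitrary diagonalizable example chosen for illustration, not one taken from the question:

```python
import numpy as np

# Arbitrary diagonalizable 2x2 matrix, used purely for illustration.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues w and a matrix S whose columns
# are the corresponding eigenvectors.
w, S = np.linalg.eig(A)

# D = S^{-1} A S should be (numerically) diagonal, with w on the diagonal.
D = np.linalg.inv(S) @ A @ S
print(np.round(D, 10))   # off-diagonal entries are ~0
print(w)                 # matches the diagonal of D
```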

## Converting Powers Back to Standard Basis

To find $A^k$ in the standard basis, you can indeed use:

$$A^k = S D^k S^{-1}$$

This works because:

$$A^k = (SDS^{-1})(SDS^{-1})\cdots(SDS^{-1}) = SD^kS^{-1}$$

The $S$ and $S^{-1}$ in the middle "cancel out" leaving just the outer $S$ and $S^{-1}$.
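
Continuing the illustrative matrix from the sketch above, the identity $A^k = S D^k S^{-1}$ can be checked against repeated multiplication:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # same illustrative matrix as above
w, S = np.linalg.eig(A)

k = 5
Dk = np.diag(w ** k)                # D^k: just raise each eigenvalue to k
Ak_via_basis = S @ Dk @ np.linalg.inv(S)

# Compare with computing A^k by repeated multiplication.
Ak_direct = np.linalg.matrix_power(A, k)
print(np.allclose(Ak_via_basis, Ak_direct))   # True
```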

## Why This Works

This approach is powerful because:
1. Computing $D^k$ is trivial (just raise eigenvalues to power $k$)
2. You only need to compute $S$ and $S^{-1}$ once
3. It works for any power $k$, including non-integer powers in some cases (see the sketch below)
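
As a rough illustration of point 3, here is a sketch of a matrix square root built the same way; it assumes all eigenvalues are real and non-negative, an extra assumption not stated above:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2, both positive
w, S = np.linalg.eig(A)

# "A^(1/2)": take the square root of each eigenvalue, then change basis back.
A_sqrt = S @ np.diag(np.sqrt(w)) @ np.linalg.inv(S)
print(np.allclose(A_sqrt @ A_sqrt, A))        # True: squaring recovers A
```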

## Example

Suppose $A$ has eigenvalues $\lambda_1 = 2$ and $\lambda_2 = 3$ with corresponding eigenvectors forming the columns of $S$.

Then:
- $D = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$
- $D^k = \begin{pmatrix} 2^k & 0 \\ 0 & 3^k \end{pmatrix}$
- $A^k = SD^kS^{-1}$

This is much easier than computing $A \cdot A \cdots A$ ($k$ factors) directly!
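
To make the example fully concrete, here is a sketch with a placeholder matrix whose eigenvalues are 2 and 3; the question does not specify $A$ or $S$, so both are made up for illustration:

```python
import numpy as np

# Placeholder matrix with eigenvalues 2 and 3 (not from the original question).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
S = np.array([[1.0, 1.0],           # columns: eigenvectors for 2 and 3
              [0.0, 1.0]])

k = 4
Dk = np.diag([2.0 ** k, 3.0 ** k])            # D^k = diag(2^k, 3^k)
Ak = S @ Dk @ np.linalg.inv(S)                # A^k = S D^k S^{-1}
print(np.allclose(Ak, np.linalg.matrix_power(A, k)))   # True
```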

So yes, your formula $A^k = S(S^{-1}AS)^kS^{-1} = SD^kS^{-1}$ is exactly right!
