Haohan's Blog
SFTP Connect Remote Server

sftp username@remote_url   # input password
cd                         # to a directory
ls                         # list the files in that dir
put filename               # upload "filename" in local to remote server
# or
put path/your_file         # the path of your local file

  • SFTP
  • Remote Server
Tuesday, December 17, 2024 | 1 minute Read
Vscode remote ssh

0.1 Install Plugin Search for Remote-SSH in the extensions marketplace and install it. 0.2 Edit config Open the SSH config file. On Windows, the path of this file is usually C:\Users\username\.ssh\config; on Ubuntu, it is usually /home/username/.ssh/config. You can also find the file via the Remote-SSH plugin in VS Code, as the picture below shows. Edit it in VS Code:
Host alias                    # customize the name of your remote server
    HostName hostname         # server IP
    User user                 # username
    IdentityFile ~/.ssh/id_rsa  # the RSA private key, if you want to log in to the server without a password (see section 0.3)
Then you can see the remote server under SSH in picture 1; click it to connect to your server.
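As a minimal illustration, an entry like the one above can also be rendered programmatically. This is a hypothetical Python helper; the function name and all argument values are my own placeholders, not part of VS Code or OpenSSH:

```python
# Hypothetical helper: render an OpenSSH config entry like the one above.
# "myserver", "192.0.2.10", and "haohan" below are made-up example values.
def ssh_config_entry(alias, hostname, user, identity_file="~/.ssh/id_rsa"):
    lines = [
        f"Host {alias}",                      # customized name of the remote server
        f"    HostName {hostname}",           # server IP or domain
        f"    User {user}",                   # login username
        f"    IdentityFile {identity_file}",  # private key for password-less login
    ]
    return "\n".join(lines)

print(ssh_config_entry("myserver", "192.0.2.10", "haohan"))
```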

  • vscode
  • ssh
  • Remote Server
Tuesday, December 17, 2024 | 2 minutes Read
Install Easyconnect in Ubuntu

0.1 Download the .deb File Download address: https://software.openkylin.top/openkylin/yangtze/pool/all/. Search for easyconnect on that page and download easyconnect_7.6.7.3.0_amd64.deb. 0.2 Install
sudo dpkg --install easyconnect_7.6.7.3.0_amd64.deb
0.3 Problems 0.3.1 Error Info We can sign in successfully the first time after installing it, but the connection fails once we quit EasyConnect or restart the computer. Error info: "the version does not match the server, please upgrade…" 0.3.2 Solutions 0.3.2.1 Method 1 Delete /usr/share/sangfor/EasyConnect/resources/conf/pkg_version.xml

  • Ubuntu
  • Softwares
  • Easyconnect
Tuesday, December 3, 2024 | 1 minute Read
Questions of R Square and Corr Coef

$y\sim(1,\boldsymbol{x})$: regress $y$ on $x$ with intercept. $y\sim(\boldsymbol{x})$: regress $y$ on $x$ without intercept. In Statistics, SSE (Sum of Squares due to Error) and SSR (Sum of Squares due to Regression) are used more frequently, but in Econometrics, ESS (Explained Sum of Squares) and RSS (Residual Sum of Squares) are preferred. 0.1 Bivariate Regression Denote the $R^2$ of $y\sim(1,x_1)$ as $R_1^2$, of $y\sim(1,x_2)$ as $R_2^2$, and of $y\sim(1,x_1,x_2)$ as $R_3^2$, and let $corr(x_1,x_2)=\rho$.
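One special case can be checked numerically: when the centered regressors are exactly orthogonal ($\rho = 0$), the multiple-regression slopes equal the simple-regression slopes, so $R_3^2 = R_1^2 + R_2^2$. A minimal pure-Python sketch; all data values are made up for illustration:

```python
# Check R3^2 = R1^2 + R2^2 in the special case rho = 0, using exactly
# orthogonal, centered regressors.
x1 = [-3.0, -1.0, 1.0, 3.0]   # centered: sum = 0
x2 = [1.0, -1.0, -1.0, 1.0]   # centered and orthogonal to x1: sum(x1_i * x2_i) = 0
y  = [1.0, 2.0, 4.0, 3.0]

ybar = sum(y) / len(y)
yc = [v - ybar for v in y]
sst = sum(v * v for v in yc)

def slope(x):
    # OLS slope of y ~ (1, x) for a centered regressor x
    return sum(a * v for a, v in zip(x, yc)) / sum(a * a for a in x)

def r2_simple(x):
    # R^2 of y ~ (1, x): SSR / SST, with SSR = b^2 * sum(x^2) for centered x
    b = slope(x)
    return b * b * sum(a * a for a in x) / sst

b1, b2 = slope(x1), slope(x2)
# Fitted values of the two-regressor model y ~ (1, x1, x2); by orthogonality
# its slopes coincide with the simple-regression slopes.
yhat = [ybar + b1 * a + b2 * c for a, c in zip(x1, x2)]
r3 = sum((h - ybar) ** 2 for h in yhat) / sst
r1, r2 = r2_simple(x1), r2_simple(x2)
```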

  • Linear Model
  • Linear Regression
  • Quant
Tuesday, November 26, 2024 | 3 minutes Read
Questions of R Square

$y\sim(1,\boldsymbol{x})$: regress $y$ on $x$ with intercept. $y\sim(\boldsymbol{x})$: regress $y$ on $x$ without intercept. In Statistics, SSE (Sum of Squares due to Error) and SSR (Sum of Squares due to Regression) are used more frequently, but in Econometrics, ESS (Explained Sum of Squares) and RSS (Residual Sum of Squares) are preferred. 0.1 Definition $$ R^2 = \frac{SSR}{SST} = \frac{||\hat{Y} - \overline{Y}||^2}{||Y - \overline{Y}||^2} = 1 - \frac{SSE}{SST} = 1 - \frac{||\hat{\epsilon}||^2}{||Y - \overline{Y}||^2} $$
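A small sanity check that the two expressions agree for a regression fitted with an intercept (only then does $SST = SSR + SSE$ hold, so the equality is specific to $y\sim(1,x)$). A pure-Python sketch on made-up data:

```python
# Verify SSR/SST == 1 - SSE/SST for OLS with an intercept (made-up data).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
# OLS with intercept: beta = Cov(x, y) / Var(x), alpha = ybar - beta * xbar
beta = sum((a - xbar) * (b - ybar) for a, b in zip(x, y)) / \
       sum((a - xbar) ** 2 for a in x)
alpha = ybar - beta * xbar
yhat = [alpha + beta * a for a in x]

sst = sum((b - ybar) ** 2 for b in y)            # total sum of squares
ssr = sum((h - ybar) ** 2 for h in yhat)         # regression sum of squares
sse = sum((b - h) ** 2 for b, h in zip(y, yhat)) # error sum of squares

r2_a = ssr / sst          # first form
r2_b = 1.0 - sse / sst    # second form
```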

  • Linear Model
  • Linear Regression
  • Quant
Tuesday, November 26, 2024 | 3 minutes Read
Questions of Assumptions

$y\sim(1,\boldsymbol{x})$: regress $y$ on $x$ with intercept. $y\sim(\boldsymbol{x})$: regress $y$ on $x$ without intercept. In Statistics, SSE (Sum of Squares due to Error) and SSR (Sum of Squares due to Regression) are used more frequently, but in Econometrics, ESS (Explained Sum of Squares) and RSS (Residual Sum of Squares) are preferred. 0.1 Heteroskedasticity and Autocorrelation If the residuals ($\epsilon$) in a linear regression model exhibit heteroskedasticity (non-constant variance) or autocorrelation (correlation between residuals across observations), how will this impact the estimation and inference of $\beta$? How can these problems be tested for and solved?
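To illustrate one part of the answer: under heteroskedasticity alone, OLS $\hat\beta$ remains unbiased and consistent (it is no longer efficient, and the usual standard errors are wrong). A minimal simulation sketch in pure Python; the true model and the noise structure below are made up:

```python
import random

random.seed(0)
n = 20000
alpha_true, beta_true = 1.0, 2.0

x = [random.uniform(0.0, 1.0) for _ in range(n)]
# Heteroskedastic noise: the error standard deviation grows with x.
eps = [random.gauss(0.0, 0.5 + 1.5 * xi) for xi in x]
y = [alpha_true + beta_true * xi + e for xi, e in zip(x, eps)]

# OLS slope: Cov(x, y) / Var(x); still close to beta_true despite heteroskedasticity.
xbar = sum(x) / n
ybar = sum(y) / n
beta_hat = sum((a - xbar) * (b - ybar) for a, b in zip(x, y)) / \
           sum((a - xbar) ** 2 for a in x)
```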

  • Linear Model
  • Linear Regression
  • Quant
Tuesday, November 26, 2024 | 7 minutes Read
Basics

0.1 Basic Content $$ \begin{align} (\boldsymbol{AB})^{-1} &= \boldsymbol{B}^{-1}\boldsymbol{A}^{-1} \cr (\boldsymbol{ABC\cdots})^{-1} &= \cdots\boldsymbol{C}^{-1}\boldsymbol{B}^{-1}\boldsymbol{A}^{-1} \cr (\boldsymbol{A}^\top)^{-1} &= (\boldsymbol{A}^{-1})^\top \cr (\boldsymbol{A} + \boldsymbol{B})^\top &= \boldsymbol{A}^\top + \boldsymbol{B}^\top \cr (\boldsymbol{AB})^\top &= \boldsymbol{B}^\top\boldsymbol{A}^\top \cr (\boldsymbol{ABC\cdots})^\top &= \cdots\boldsymbol{C}^\top\boldsymbol{B}^\top\boldsymbol{A}^\top \cr (\boldsymbol{A}^{H})^{-1} &= (\boldsymbol{A}^{-1})^{H} \cr (\boldsymbol{A} + \boldsymbol{B})^H &= \boldsymbol{A}^H + \boldsymbol{B}^H \cr (\boldsymbol{AB})^H &= \boldsymbol{B}^H\boldsymbol{A}^H \cr (\boldsymbol{ABC\cdots})^H &= \cdots\boldsymbol{C}^H\boldsymbol{B}^H\boldsymbol{A}^H \end{align} $$
0.2 Trace $$ \begin{align} \text{TR}(\boldsymbol{A}) &= \sum_{i}A_{ii} \cr \text{TR}(\boldsymbol{A}) &= \sum_{i}\lambda_{i}, \quad \lambda_{i}=\text{eig}(\boldsymbol{A})_{i} \cr \text{TR}(\boldsymbol{A}) &= \text{TR}(\boldsymbol{A}^\top) \cr \text{TR}(\boldsymbol{AB}) &= \text{TR}(\boldsymbol{BA}) \cr \text{TR}(\boldsymbol{A+B}) &= \text{TR}(\boldsymbol{A}) + \text{TR}(\boldsymbol{B}) \cr \text{TR}(\boldsymbol{ABC}) &= \text{TR}(\boldsymbol{BCA}) = \text{TR}(\boldsymbol{CAB}) \cr \boldsymbol{a}^\top\boldsymbol{a} &= \text{TR}(\boldsymbol{aa}^\top) \end{align} $$
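A tiny numeric check of $\text{TR}(\boldsymbol{AB}) = \text{TR}(\boldsymbol{BA})$, sketched in pure Python with made-up matrices; note that $\boldsymbol{AB}$ is $2\times 2$ while $\boldsymbol{BA}$ is $3\times 3$, yet the traces agree:

```python
def matmul(A, B):
    # naive matrix product of nested lists
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def trace(M):
    # sum of diagonal entries
    return sum(M[i][i] for i in range(len(M)))

A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]        # 2 x 3
B = [[1.0, 0.0],
     [2.0, 1.0],
     [0.0, 3.0]]             # 3 x 2

tr_ab = trace(matmul(A, B))  # trace of a 2x2 product
tr_ba = trace(matmul(B, A))  # trace of a 3x3 product
```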

  • Linear Algebra
  • Matrix
Monday, November 25, 2024 | 2 minutes Read
Notations

With reference to The Matrix Cookbook. 0.1 Notation and Nomenclature A table of the notation used throughout, e.g. $\boldsymbol{A}$ (a matrix), $(\boldsymbol{A})_{ij}$, $\boldsymbol{A}_{ij}$, $[\boldsymbol{A}]_{ij}$ (the $(i,j)$ element), $\boldsymbol{A}^{n}$, $\boldsymbol{A}^{-1}$ (inverse), $\boldsymbol{A}^{+}$ (pseudoinverse), $\boldsymbol{A}^{1/2}$ (square root), $\boldsymbol{a}$ (a vector), $a_i$, $a$ (a scalar), $\mathfrak{R}z$ and $\mathfrak{F}z$ (real and imaginary parts of a scalar, vector, or matrix), $\det(\boldsymbol{A})$, $\text{Tr}(\boldsymbol{A})$, $\text{diag}(\boldsymbol{A})$, $\text{eig}(\boldsymbol{A})$, $\text{vec}(\boldsymbol{A})$, $||\boldsymbol{A}||$ (norm), $\boldsymbol{A}^\top$, $\boldsymbol{A}^{-\top}$, $\boldsymbol{A}^{*}$, $\boldsymbol{A}^H$ (transpose, inverse transpose, conjugate, conjugate transpose), $\boldsymbol{A}\circ\boldsymbol{B}$ (Hadamard product), $\boldsymbol{A}\otimes\boldsymbol{B}$ (Kronecker product), $\boldsymbol{0}$, $\boldsymbol{I}$, $\boldsymbol{J}^{ij}$ (zero, identity, and single-entry matrices), $\boldsymbol{\Sigma}$, $\boldsymbol{\Lambda}$.

  • Linear Algebra
  • Matrix
Monday, November 25, 2024 | 2 minutes Read
Linear Regression and Stats

This post focuses on Ordinary Linear Regression. 0.1 Simple Linear Regression The most basic version of a linear model is Simple Linear Regression, which can be expressed by this formula: $$ y = \alpha + \beta \times x + \epsilon $$ where $\alpha$ is called the intercept, $\beta$ the slope, and $\epsilon$ the residual. The coefficients of Simple Linear Regression can be solved with the least squares method, by minimizing $\sum_{i=1}^{n}(y_i-\hat{y}_i)^2$.
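The minimizer has the closed form $\beta = \mathrm{Cov}(x,y)/\mathrm{Var}(x)$ and $\alpha = \bar y - \beta\bar x$; a minimal pure-Python sketch, checked on made-up points that lie exactly on a line:

```python
def simple_ols(x, y):
    # Closed-form least squares for y = alpha + beta * x + eps
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    beta = sum((a - xbar) * (b - ybar) for a, b in zip(x, y)) / \
           sum((a - xbar) ** 2 for a in x)   # Cov(x, y) / Var(x)
    alpha = ybar - beta * xbar               # intercept
    return alpha, beta

# Points that lie exactly on y = 1 + 2x, so the residuals are all zero.
alpha, beta = simple_ols([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
```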

  • Linear Model
  • Linear Regression
Sunday, November 17, 2024 | 2 minutes Read
Linear Regression

0.1 General Expression $$y_{i}=\beta_{0}+\beta_{1}\times x_{i1}+\cdots+\beta_{p}\times x_{ip}+\epsilon_{i},\quad i=1,2,\cdots,n$$ $$ \begin{align*} \mathbf{y}&=(y_{1},y_{2},\cdots,y_{n})^{T} \cr \mathbf{X}&=\begin{bmatrix}1 & x_{11} & x_{12} & \cdots & x_{1p} \cr 1 & x_{21} & x_{22} & \cdots & x_{2p} \cr \vdots & \vdots & \vdots & \ddots & \vdots \cr 1 & x_{n1} & x_{n2} & \cdots & x_{np} \end{bmatrix} \cr \mathbf{\beta}&=(\beta_{0},\beta_{1},\cdots,\beta_{p})^{T} \cr \mathbf{\epsilon}&=(\epsilon_{1}, \epsilon_{2},\cdots,\epsilon_{n})^{T} \end{align*} $$ 0.2 OLS Assumptions 1. The regression model is linear in its parameters. 2. ${x_{i1},x_{i2},\cdots,x_{ip}}$ are nonstochastic variables. 3. $E(\epsilon_{i})=0$. 4. $Var(\epsilon_{i})=\sigma^{2}$. 5. ${\epsilon_{i}}$ are independent random variables, that is, there is no autocorrelation: $cov(\epsilon_{i},\epsilon_{j})=0,i\neq j$. 6. The regression model is correctly specified, with no specification bias. 0.3 OLS Estimators 0.3.1 Estimator $\hat{\beta}$ Formally, the OLS estimator of $\beta$ is defined as the minimizer of the residual sum of squares (RSS): $$\hat{\mathbf{\beta}}=\arg\min_{\mathbf{\beta}}\ S(\mathbf{\beta})$$ $$S(\mathbf{\beta})=(\mathbf{y}-\mathbf{X\beta})^{T}(\mathbf{y}-\mathbf{X\beta})=\sum\limits_{i=1}^{n}(y_{i}-\beta_{0}-\beta_{1}\times x_{i1}-\cdots-\beta_{p}\times x_{ip})^{2}$$ Differentiating and setting the derivative to zero, we get: $$\hat{\mathbf{\beta}}=(\mathbf{X^{T}X})^{-1}\mathbf{X^{T}y}$$
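For the single-regressor case ($p=1$), $\hat{\mathbf{\beta}}=(\mathbf{X^{T}X})^{-1}\mathbf{X^{T}y}$ reduces to a $2\times 2$ system of normal equations that can be solved explicitly. A minimal pure-Python sketch; the data values are made up:

```python
def ols_normal_equations(x, y):
    # X = [1 | x], so X^T X = [[n, Sx], [Sx, Sxx]] and X^T y = [Sy, Sxy];
    # invert the 2x2 matrix analytically.
    n = len(x)
    sx = sum(x)
    sxx = sum(a * a for a in x)
    sy = sum(y)
    sxy = sum(a * b for a, b in zip(x, y))
    det = n * sxx - sx * sx              # det(X^T X)
    b0 = (sxx * sy - sx * sxy) / det     # intercept beta_0
    b1 = (n * sxy - sx * sy) / det       # slope beta_1
    return b0, b1

b0, b1 = ols_normal_equations([0.0, 1.0, 2.0, 3.0], [1.0, 3.2, 4.8, 7.1])
```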

  • Linear Model
  • Linear Regression
Sunday, November 17, 2024 | 5 minutes Read
Questions of Coefficients

$y\sim(1,\boldsymbol{x})$: regress $y$ on $x$ with intercept. $y\sim(\boldsymbol{x})$: regress $y$ on $x$ without intercept. In Statistics, SSE (Sum of Squares due to Error) and SSR (Sum of Squares due to Regression) are used more frequently, but in Econometrics, ESS (Explained Sum of Squares) and RSS (Residual Sum of Squares) are preferred. 0.1 Product of $\beta$ Denote by $\beta_1$ the least squares solution of $y=\beta x+\epsilon$ and by $\beta_2$ the least squares solution of $x=\beta y+\epsilon$. Find the min and max values of $\beta_1\beta_2$. $$ \beta_1 = \frac{Cov(X,Y)}{Var(X)},\quad \beta_2 = \frac{Cov(X,Y)}{Var(Y)}\Rightarrow \beta_1\beta_2 = \rho_{XY}^2 \in [0,1] $$
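A numeric sanity check that the product of the two slopes equals $\rho_{XY}^2$ and therefore lies in $[0,1]$; a pure-Python sketch on made-up data:

```python
# Check beta1 * beta2 == rho^2 for the two reversed regressions (made-up data).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.7, 4.1, 4.9]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxy = sum((a - xbar) * (b - ybar) for a, b in zip(x, y))  # ~ Cov(X, Y)
sxx = sum((a - xbar) ** 2 for a in x)                     # ~ Var(X)
syy = sum((b - ybar) ** 2 for b in y)                     # ~ Var(Y)

beta1 = sxy / sxx                  # slope of y regressed on x
beta2 = sxy / syy                  # slope of x regressed on y
rho2 = sxy * sxy / (sxx * syy)     # squared correlation coefficient
```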

  • Linear Model
  • Linear Regression
  • Quant
Saturday, November 16, 2024 | 4 minutes Read
Questions of Conceptions

With reference to donggua, I have completed the answers to these questions. 0.1 Notations $y\sim(1,\boldsymbol{x})$: regress $y$ on $x$ with intercept. $y\sim(\boldsymbol{x})$: regress $y$ on $x$ without intercept. In Statistics, SSE (Sum of Squares due to Error) and SSR (Sum of Squares due to Regression) are used more frequently, but in Econometrics, ESS (Explained Sum of Squares) and RSS (Residual Sum of Squares) are preferred. 0.2 Conceptions and Basic Definitions 0.2.1 The Assumptions of LR Gauss-Markov Theorem: Under the assumptions of classical linear regression, the ordinary least squares (OLS) estimator is the best linear unbiased estimator (BLUE), i.e. the linear unbiased estimator with the minimum variance.

  • Linear Model
  • Linear Regression
  • Quant
Saturday, November 16, 2024 | 5 minutes Read
Contact me:
  • zhhohoh27@gamil.com
  • online727
  • Haohan Zhao
  • +86 19551998168

Liability Notice: This site is built with Hugo and GitHub Pages and uses the hugo-toha theme. It is used for personal blogging; all content is my own and does not constitute advice of any kind. If you have any questions, please contact me!

