Matlab代写,计算机作业代写,高分原创代码

什么是Gradient Descent?

假设你在玩一个游戏,玩家在山顶上,他们被要求到达山的最低点。此外,他们还被蒙上了眼睛。那么,你认为什么方法能让你到达山脚?

最好的方法是观察地面,找到土地下降的位置。从这个位置开始,向下降的方向迈出一步,反复这个过程,直到我们到达最低点。梯度下降是一种迭代优化算法,用于寻找函数的局部最小值。

梯度下降算法的目标是最小化给定函数(比如成本函数)。为了实现这个目标,它迭代地执行两个步骤:

  • 计算梯度(斜率),即函数在该点的一阶导数
  • 在与梯度相反的方向上迈出一步(移动),斜率的相反方向从当前点增加 alpha 乘以该点的梯度

essay 代写

梯度下降的最优化的广泛应用

机械学:设计航空航天产品的表面;

经济学:成本最小化;

物理学:量子计算中的优化时间;

决定最佳运输路线,货架空间优化等等。 许多流行的机器算法都依赖于线性回归,k最近邻,神经网络等技术。优化的应用是无限的,因此它成为了学术界和工业界广泛研究的课题。在本文中,我们将介绍一种称为梯度下降(Gradient Descent)的优化技术。 这是机器学习时最常用的优化技术。

计算机作业代写:使用GD、SGD、ALS方法优化矩阵分解

Two datasets are given for you to test your model performance.

  1. Optimize Matrix Factorization using GD, SGD, ALS method, using the same set of hyperparameters.
  2. Please, visualize the loss history with respect to the iterations to ensure your model is converged within num_iter , the number of iteration you set.
  3. Then, report final loss for all methods, and training time. And write a short explanation for your results.
  4. set the number of latent factor be 2. ### Visualization Requirement:
  5. Please use the same learning rate for both GD and SGD. and choose a relatively suitable value for both of them.
  6. Display the training curve in the same graph. Using different colors to represent different optimization method.
  7. Please include a legend.

1. Toy example

Apart from the general requirements from Part 3, print the matrix factorization result U, and V (User-Feature Matrix, and Item-Feature matrix) for the toy example for each optimization method, respectively.

Below is the data and sample code ).

In [ ]:
d = {'item0': [np.nan, 7, 6, 1, 1,5],
 'item1': [3, 6, 7, 2, np.nan, 4],
 'item2': [3, 7, np.nan, 2, 1,5],
 'item3': [1, 4, 4, 3, 2, 3],
 'item4': [1, 5, 3, 3, 3, 4],
 'item5': [np.nan, 4, 4, 4, 3, 2]}
df = pd.DataFrame(data=d).astype('float32')
X = np.array(df)
#
MF_GD = MatrixFactor(k = 2, opt_method = 'GD', learn_rt = 0.001, num_iter=200
U_GD, VT_GD, loss_hist_GD = MF_GD.fit(X)
In [ ]:
print(U_GD, "\nuser-feature matrix using GD")
print(np.transpose(U_GD), "\nitem-feature matrix using GD")
[[0.79494786 0.78902942]
 [1.62009744 2.42686695]
 [1.87025306 1.85810683]
 [0.71288845 0.98068888]
 [1.12185167 0.51928559]
 [0.83894803 1.90538961]]
user-feature matrix using GD
[[0.79494786 1.62009744 1.87025306 0.71288845 1.12185167 0.83894803]
 [0.78902942 2.42686695 1.85810683 0.98068888 0.51928559 1.90538961]]
item-feature matrix using GD
In [ ]: import matplotlib.cm as cm
import matplotlib.pyplot as plt
plt.scatter(range(len(loss_hist_GD)), loss_hist_GD, label='GD' )
plt.legend()
Out[ ]:<matplotlib.legend.Legend at 0x7f9cf2cc6bd0>

contact

Assignment Exmaple

Recent Case

Service Scope

C|C++|Java|Python|Matlab|Android|Jsp|Prolo
g|MIPS|Haskell|R|Linux|C#|PHP|SQL|.Net|Hand
oop|Processing|JS|Ruby|Scala|Rust|Data Mining|数据库|Oracle|Mysql|Sqlite|IOS|Data Mining|网络编程|多线程编程|Linux编程操作系统|计算机网络|留学生|编程|程序|代写|加急|个人代写|作业代写|Assignment

Wechat:maxxuezhang

wechat