159.740 Intelligent Systems
Assignment #2 
N.H.Reyes 
Letter Recognition using Deep Neural Nets with Softmax Units 
Deadline: 4th of November 
Instructions: 
You are allowed to work in a group of 2 members for this assignment. 
Your task is to write a program that implements and tests a multi-layer feed-forward network for 
recognising characters defined in the UCI machine learning repository: 
http://archive.ics.uci.edu/ml/datasets/Letter+Recognition
Requirements: 
1. Use Qt to develop your Neural Network application. A short tutorial on Qt, and start-up 
code to help you get started quickly with the assignment, are provided via Stream. 
2. You may use or consult code available in books and websites, provided that you cite it 
properly, explain it clearly, and integrate it with the start-up code provided. 
3. Implement a multi-layer feed-forward network using backpropagation learning and test it on the 
given problem domain using different network configurations and parameter settings. There 
should be at least 2 hidden layers in your neural network. 
[Figure: fully connected multi-layer feed-forward network. Inputs X1 … X16 feed hidden layers Hi (nodes h11, h12, …) and Hj (nodes h21, h22, …), which feed output units F1 … Fm producing outputs OF1 … OFm. Error terms δf1, δf2, …, δh21, δh22, …, δh11, δh12, … are backpropagated from the outputs through the hidden layers. Legend: input node; hidden node; output node = softmax unit.]
 Note that all nodes except the input nodes have a bias node attached to them. 
A. Inputs 
 16 primitive numerical attributes (statistical moments and edge counts) 
 The input values in the data set have been scaled to fit into a range of integer values 
from 0 through 15. It is up to you if you want to normalise the inputs before feeding 
them to your network. 
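As an illustration of one possible normalisation (the helper name below is ours, not from the start-up code), each 0–15 attribute can be scaled into [0, 1] before being fed to the network:

```cpp
#include <cstddef>
#include <vector>

// Illustrative helper (not part of the start-up code): scales the raw
// 0..15 UCI attribute values into the range [0, 1].
std::vector<double> normaliseInputs(const std::vector<int>& raw) {
    std::vector<double> scaled(raw.size());
    for (std::size_t i = 0; i < raw.size(); ++i)
        scaled[i] = raw[i] / 15.0;   // 0 -> 0.0, 15 -> 1.0
    return scaled;
}
```

Whether you normalise at all is left to you; scaling to [0, 1] mainly affects how quickly tanh and ReLU hidden units saturate or activate early in training.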
B. Data sets 
 Use the data set downloadable from: 
 Training set: use the first 16,000 
 Test set/Validation set: use the remaining 4,000 
 Submit your training data, validation/test data in separate files. 
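A minimal sketch of reading the data file (comma-separated: the class letter followed by 16 integer attributes); the struct and function names are our own, not from the start-up code:

```cpp
#include <istream>
#include <sstream>
#include <string>
#include <vector>

struct Pattern {
    int label;               // 0 = 'A' ... 25 = 'Z'
    std::vector<int> attrs;  // the raw integer attributes
};

// Illustrative reader (names are ours): parses lines of the form
// "T,2,8,3,5,..." from the UCI letter-recognition data file.
std::vector<Pattern> readPatterns(std::istream& in) {
    std::vector<Pattern> data;
    std::string line;
    while (std::getline(in, line)) {
        if (line.empty()) continue;
        std::stringstream ss(line);
        std::string tok;
        std::getline(ss, tok, ',');          // class letter
        Pattern p;
        p.label = tok[0] - 'A';
        while (std::getline(ss, tok, ','))   // the attributes
            p.attrs.push_back(std::stoi(tok));
        data.push_back(p);
    }
    return data;
}
```

The split is then simply the first 16,000 entries for training and the remaining 4,000 for testing/validation.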
C. Performance measure: 
 Mean Squared Error (MSE) 
 Percentage of Good Classification (PGC) 
 Confusion Matrix (only for the best Neural Network configuration found) 
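The two scalar measures can be computed as sketched below (function names are ours; MSE is averaged over all patterns and output units, and PGC takes the arg-max output as the predicted letter):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Mean Squared Error averaged over every pattern and every output unit.
double meanSquaredError(const std::vector<std::vector<double>>& outputs,
                        const std::vector<std::vector<double>>& targets) {
    double sum = 0.0;
    std::size_t terms = 0;
    for (std::size_t p = 0; p < outputs.size(); ++p)
        for (std::size_t k = 0; k < outputs[p].size(); ++k) {
            double d = outputs[p][k] - targets[p][k];
            sum += d * d;
            ++terms;
        }
    return sum / terms;
}

// Percentage of Good Classification: a pattern counts as good when the
// arg-max output unit matches the target class index.
double percentGoodClassification(const std::vector<std::vector<double>>& outputs,
                                 const std::vector<int>& labels) {
    std::size_t good = 0;
    for (std::size_t p = 0; p < outputs.size(); ++p) {
        auto it = std::max_element(outputs[p].begin(), outputs[p].end());
        if (static_cast<int>(it - outputs[p].begin()) == labels[p]) ++good;
    }
    return 100.0 * good / outputs.size();
}
```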
D. Training 
 Provide a facility for shuffling data before feeding it to the network during training 
 Provide a facility for continuing network training after loading weights from file (do not 
reset the weights). 
 Provide a facility for training the network continuously until either the maximum 
epochs have been reached, or the target percentage of good classification has been met. 
 For each training epoch, record the Mean Squared Error and the Percentage of Good 
Classification in a text file. You need this to plot the results of training later, to 
compare the effects of the parameter settings and the architecture of your network. 
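The shuffling facility can be as simple as permuting an index vector once per epoch; a sketch (function name is ours) that works with the fixed seed 123 required in item 7:

```cpp
#include <algorithm>
#include <cstddef>
#include <numeric>
#include <random>
#include <vector>

// Sketch of a reproducible shuffle: the generator is seeded once (with
// the fixed seed 123 from item 7) and reused, so each epoch sees a
// different permutation but every run of the program is identical.
std::vector<std::size_t> shuffledIndices(std::size_t n, std::mt19937& rng) {
    std::vector<std::size_t> idx(n);
    std::iota(idx.begin(), idx.end(), 0);      // 0, 1, ..., n-1
    std::shuffle(idx.begin(), idx.end(), rng);
    return idx;
}
```

Visiting the training patterns through the permuted indices avoids copying the data set itself on every shuffle.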
E. Testing the Network 
 Calculate the performance of the network on the Test set in terms of both the MSE and 
PGC. 
F. Network Architecture 
 It is up to you to determine the number of hidden layers and number of hidden nodes 
per hidden layer in your network. The minimum number of hidden layers is 2. 
 Use softmax units at the output layer 
 Experiment with ReLU and tanh as the activation functions of your hidden units 
 Determine the weight-update formulas based on the activation functions used 
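A sketch of the softmax output units and the two hidden-unit activations to experiment with (function names are ours; the maximum is subtracted before exponentiating, which leaves the softmax values unchanged but avoids overflow):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Softmax over the output-layer net inputs: exp(a_k - m) / sum_j exp(a_j - m).
std::vector<double> softmax(const std::vector<double>& a) {
    double m = *std::max_element(a.begin(), a.end());
    std::vector<double> out(a.size());
    double sum = 0.0;
    for (std::size_t k = 0; k < a.size(); ++k) {
        out[k] = std::exp(a[k] - m);
        sum += out[k];
    }
    for (double& o : out) o /= sum;
    return out;
}

// Hidden-unit activations and the derivatives needed for backprop.
double relu(double x)      { return x > 0.0 ? x : 0.0; }
double reluDeriv(double x) { return x > 0.0 ? 1.0 : 0.0; }
double tanhDeriv(double y) { return 1.0 - y * y; }  // y = tanh(x)
```

Expressing tanh's derivative in terms of its output y = tanh(x) is convenient in backprop, since y is already stored from the forward pass.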
4. Provide an interface in your program for testing the network using an input string consisting of 
the 16 attributes. The results should indicate the character classification and the 26 actual 
numeric outputs of the network. (The start-up code partly includes this functionality already 
for a simple 3-layer network with 1 hidden layer, but you need to modify it to work for the 
multiple-hidden-layer architecture that you have designed.) 
5. Provide an interface in your program for: 
A. Reading the entire data set 
B. Initialising the network 
C. Loading trained weights 
D. Saving trained weights 
E. Training the network up to a maximum number of epochs 
F. Testing the network on a specified test set (from a file) 
G. Shuffling the training set. 
6. Set the default settings of the user interface (e.g. learning rate, weights, etc.) to the 
configuration that delivered the best experimental results. 
7. Use a fixed random seed (123) so that any randomisation can be reproduced exactly. 
8. It is up to you to write the main program, and any classes or data structures that you may 
require. 
9. You may choose to use a momentum term or a regularisation term as part of backpropagation 
learning. Indicate in your documentation if you are using this technique. 
10. You need to modify the weight-update rules to reflect the correct derivatives of the activation 
function used in your network architecture. 
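One consistent way to write these rules, assuming a cross-entropy loss is paired with the softmax outputs (which yields the simple output delta below; with a pure squared-error loss the output delta picks up an extra softmax Jacobian factor), is:

```latex
\delta_k = o_k - t_k \qquad \text{(softmax output unit $k$)}

\delta_j = f'(net_j)\sum_k w_{jk}\,\delta_k,\qquad
f'(net_j) = \begin{cases}
1 - y_j^{2} & \text{tanh hidden unit}\\
[\,net_j > 0\,] & \text{ReLU hidden unit}
\end{cases}

\Delta w_{ij}(t) = -\eta\,\delta_j\,y_i + \alpha\,\Delta w_{ij}(t-1)
```

where η is the learning rate, y_i the activation feeding weight w_{ij}, and α the optional momentum term from item 9 (set α = 0 if unused).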
11. Provide graphs in Excel showing the network performance on training data and test data 
(similar to the graphs discussed in the lecture). 
12. Provide the specifications of your best trained network by filling in the Excel workbook 
(best_network_configuration.xlsx). 
13. Provide a confusion matrix for the best NN classifier system found in your experiments. 
14. Provide a short user guide for your program. 
15. Fill in the Excel file named checklist.xlsx, to allow for accurate marking of your assignment. 
Criteria for marking 
 Documentation – 30% 
o Submit the trained weights of your best network (name it as best_weights.txt) 
o Generate a graph of the performance of your best performing network (MSE vs. 
Epochs) on the training set and test set. 
o Generate a confusion matrix of your best network 
o fill in the Excel file named checklist.xlsx
o fill in the Excel file named best_network_configuration.xlsx
o provide a short user guide for your program 
 System implementation – 70% 
Nothing follows. 
N.H.Reyes 
