Search results
- * **Activation Functions:** introduce non-linearity, enabling the model to learn complex patterns, e.g. [[Sigm… * **Loss Functions:** measure the difference between the model's predictions and the true values, e.g. [[均方误差 (mean squared error)… ...8 KB (316 words) - 02:51, 12 May 2025
- [[Category:Activation functions]] ...8 KB (121 words) - 11:58, 11 May 2025
- * **Loss Functions:** a loss function measures the gap between the model's predictions and the true results… model.add(Dense(16, input_dim=10, activation='relu')) ...9 KB (254 words) - 00:43, 27 March 2025 (this partial line is completed in the sketch after the results)
- [[Category:Activation functions]] ...8 KB (227 words) - 04:34, 9 May 2025
- * **Activation Functions:** [[激活函数]] are used to introduce non-linearity. * **Loss Functions:** [[损失函数]] are used to measure the model's prediction error. ...8 KB (197 words) - 04:24, 7 May 2025
- [[Category:Activation functions]] ...8 KB (167 words) - 10:15, 3 May 2025
- * **Activation Functions**: introduce non-linearity so the neural network can learn more complex patterns. Common… * **Loss Functions**: measure the gap between the model's predictions and the true results. Commonly used loss… ...10 KB (191 words) - 06:42, 7 May 2025
- * '''Activation Functions''': a mathematical function that decides a neuron's output. Common activation… [[Category:Artificial intelligence]] ...10 KB (180 words) - 04:04, 18 May 2025
- | Activation Functions || introduce non-linearity, strengthening the model's expressive power || ReLU, Sigmoid, Tanh, [[… | Loss Functions || measure the difference between model predictions and true values || mean squared error (MSE), … ...9 KB (271 words) - 13:31, 27 March 2025 (standard definitions of these functions follow the results)
- * '''Activation Functions:''' VGGNet uses [[ReLU]] (Rectified Linear Unit) as its activation function. Re… [[Category:Convolutional Neural Networks]] [[Technical analysis]] [[数… ...10 KB (264 words) - 16:48, 12 May 2025
- * **Activation Functions**: activation functions introduce non-linear factors, enabling the network to learn complex… [[Category:Deep learning models]] ...10 KB (139 words) - 11:25, 12 May 2025
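For reference, the activations and loss named in the truncated table row above have standard textbook definitions (stated here from general knowledge, not recovered from the indexed pages):

\[
\mathrm{ReLU}(x) = \max(0, x), \qquad \sigma(x) = \frac{1}{1+e^{-x}}, \qquad \tanh(x) = \frac{e^{x}-e^{-x}}{e^{x}+e^{-x}}
\]

\[
\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - \hat{y}_i\bigr)^2
\]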
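The third result breaks off at a partial Keras line. A minimal runnable sketch that completes it, assuming TensorFlow/Keras is installed; the output layer, optimizer, and synthetic data below are illustrative assumptions, not content from the indexed page:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Hidden layer from the snippet: ReLU supplies the non-linearity
# that the results above describe.
model = Sequential()
model.add(Dense(16, input_dim=10, activation='relu'))
# Assumed output layer: sigmoid squashes the output into (0, 1).
model.add(Dense(1, activation='sigmoid'))

# Mean squared error measures the gap between predictions and targets,
# matching the loss-function snippets above; 'adam' is an assumed choice.
model.compile(optimizer='adam', loss='mse')

# Synthetic data, just to show the pieces fit together end to end.
X = np.random.rand(100, 10)
y = np.random.randint(0, 2, size=(100, 1))
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
print(model.predict(X[:3], verbose=0))
```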