
Update:

Thanks to the solutions suggested in the comments.

model.summary(), model.get_config(), and a simple for loop over model.layers can all be used to obtain the names of Keras layers.

An example is shown below.
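
As a minimal sketch of those three approaches (assuming model is any already-built Keras model):

# Assumes `model` is an existing Keras model.
model.summary()             # prints a table that includes every layer's name

print(model.get_config())   # full model config; each layer entry contains its name
                            # (the exact structure varies with the Keras version)

for layer in model.layers:  # iterate over the layer objects directly
    print(layer.name)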

An example of naming Keras layers, extracting intermediate layer output values, and saving them to a file

To give a specific Keras layer a name, simply pass a name argument when constructing the layer:

model.add(Activation('softmax', name='dense_1'))  # note: name is passed inside the layer constructor

# Extract an intermediate layer
from keras.models import Model
import keras
layer_name = 'dense_1'  # name of the layer to extract
intermediate_layer_model = Model(inputs=model.input,
                                 outputs=model.get_layer(layer_name).output)  # new model ending at that layer
intermediate_output = intermediate_layer_model.predict(X_test)

# Write each sample's intermediate output to a text file
doc = open(r'C://Users//CCUT04//Desktop//1.txt', 'w')
for i in intermediate_output:
    print(i)
    print(i, file=doc)
doc.close()
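
If you only need the raw numbers, a hedged alternative (file names here are assumptions) is to let numpy write the whole array at once instead of looping:

import numpy as np

np.savetxt('intermediate_output.txt', intermediate_output)  # text format, 2-D arrays only
np.save('intermediate_output.npy', intermediate_output)     # binary format, any shape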

Additional note: extracting the output of an intermediate NN layer with Keras

Build model...
__________________________________________________________________________________________________
Layer (type)     Output Shape   Param #  Connected to      
==================================================================================================
main_input (InputLayer)   (None, 89, 39)  0           
__________________________________________________________________________________________________
cropping1d_1 (Cropping1D)  (None, 85, 39)  0   main_input[0][0]     
__________________________________________________________________________________________________
cropping1d_2 (Cropping1D)  (None, 85, 39)  0   main_input[0][0]     
__________________________________________________________________________________________________
cropping1d_3 (Cropping1D)  (None, 85, 39)  0   main_input[0][0]     
__________________________________________________________________________________________________
cropping1d_4 (Cropping1D)  (None, 85, 39)  0   main_input[0][0]     
__________________________________________________________________________________________________
cropping1d_5 (Cropping1D)  (None, 85, 39)  0   main_input[0][0]     
__________________________________________________________________________________________________
concatenate_1 (Concatenate)  (None, 85, 195)  0   cropping1d_1[0][0]    
                 cropping1d_2[0][0]    
                 cropping1d_3[0][0]    
                 cropping1d_4[0][0]    
                 cropping1d_5[0][0]    
__________________________________________________________________________________________________
fc1 (BatchNormalization)  (None, 85, 195)  780   concatenate_1[0][0]    
__________________________________________________________________________________________________
fc2 (Bidirectional)    (None, 85, 2048)  9994240  fc1[0][0]      
__________________________________________________________________________________________________
fc3 (BatchNormalization)  (None, 85, 2048)  8192  fc2[0][0]      
__________________________________________________________________________________________________
global_average_pooling1d_1 (Glo (None, 2048)   0   fc3[0][0]      
__________________________________________________________________________________________________
main_output (Dense)    (None, 2)   4098  global_average_pooling1d_1[0][0] 
==================================================================================================
Total params: 10,007,310
Trainable params: 10,002,824
Non-trainable params: 4,486
__________________________________________________________________________________________________

Suppose my network has the structure shown above.
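
The post does not include the code that builds this model; the following is a hedged reconstruction, assuming five one-step-shifted Cropping1D slices and a Bidirectional LSTM with 1024 units per direction, chosen so that the output shapes and parameter counts match the summary above:

from keras.layers import (Input, Cropping1D, Concatenate, BatchNormalization,
                          Bidirectional, LSTM, GlobalAveragePooling1D, Dense)
from keras.models import Model

main_input = Input(shape=(89, 39), name='main_input')
# Five crops of length 85, each shifted by one timestep: (i, 4 - i) trims i steps
# from the start and 4 - i steps from the end.
crops = [Cropping1D(cropping=(i, 4 - i))(main_input) for i in range(5)]
x = Concatenate(name='concatenate_1')(crops)                          # (None, 85, 195)
x = BatchNormalization(name='fc1')(x)                                 # 780 params
x = Bidirectional(LSTM(1024, return_sequences=True), name='fc2')(x)   # (None, 85, 2048)
x = BatchNormalization(name='fc3')(x)                                 # 8192 params
x = GlobalAveragePooling1D()(x)                                       # (None, 2048)
main_output = Dense(2, activation='softmax', name='main_output')(x)   # 4098 params
model = Model(inputs=main_input, outputs=main_output)
model.summary()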

If I want to get the output of the pooling layer, there are two ways to do it in Keras.

# Method 1: build a sub-model that ends at the pooling layer
intermediate_layer_model = Model(inputs=model.input,
                                 outputs=model.get_layer('global_average_pooling1d_1').output)
# model.summary()
# model.get_layer('cropping1d_1')
intermediate_output = intermediate_layer_model.predict(data)

Here data is whatever input data you are feeding the model.

# Method 2: use a backend function mapping the model input to the layer's output
from keras import backend as K
get_11th_layer_output = K.function([model.layers[0].input],
                                   [model.layers[10].output])
layer_output = get_11th_layer_output([data])[0]

In my model, model.layers[10] is the pooling layer.

These two snippets produce the same output.
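
A quick sanity check of that claim (a sketch, assuming both snippets above have already been run on the same data):

import numpy as np

# Both paths should return the pooling layer's activations for the same inputs.
print(np.allclose(intermediate_output, layer_output))  # expected: True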

In practice, the second approach is the one I see used most often.

That concludes this example of naming Keras layers, extracting intermediate layer output values, and saving them to a file. I hope it serves as a useful reference.

Tags:
keras, layer naming, intermediate layer, output values
