Using tf.keras.layers with the low-level TensorFlow API


Can I combine tf.keras.layers with the low-level TensorFlow API?

The code below is not correct, but it shows what I want to do: create placeholders that will be fed with data later (inside a tf.Session()), and pass that data through my model.

X, Y = create_placeholders(n_x, n_y)

output = create_model('channels_last')(X)

cost = compute_cost(output, Y)
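
Roughly, this is what I imagine those helpers doing. This is only a sketch: create_placeholders, create_model and compute_cost are my own helper names, the layer sizes are arbitrary, and the data_format argument would only matter for convolutional layers.

import tensorflow as tf

def create_placeholders(n_x, n_y):
    # Placeholders to be fed later inside a tf.Session()
    X = tf.placeholder(tf.float32, shape=[None, n_x], name='X')
    Y = tf.placeholder(tf.float32, shape=[None, n_y], name='Y')
    return X, Y

def create_model(data_format):
    # A small stack of tf.keras layers, returned as a callable
    # (data_format is unused here; it would only matter for Conv/Pooling layers)
    def model(inputs):
        hidden = tf.keras.layers.Dense(64, activation='relu')(inputs)
        return tf.keras.layers.Dense(10)(hidden)
    return model

def compute_cost(output, Y):
    # Softmax cross-entropy between the model output (logits) and the labels
    return tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits_v2(labels=Y, logits=output))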
Tags: tensorflow, keras, keras-layer

1 Answer

Yes, you can. It works the same way as using tf.layers.dense(); in fact, tf.keras.layers.Dense() is the preferred approach in recent TensorFlow versions (tf.layers.dense() is deprecated as of 1.13). For example:


import tensorflow as tf
import numpy as np

# Toy training data: two samples with two features each, plus integer class labels.
x_train = np.array([[-1.551, -1.469], [1.022, 1.664]], dtype=np.float32)
y_train = np.array([1, 0], dtype=int)

# Placeholders that will be fed inside the tf.Session() below.
x = tf.placeholder(tf.float32, shape=[None, 2])
y = tf.placeholder(tf.int32, shape=[None])

with tf.name_scope('network'):
    # tf.keras layers can be called directly on placeholder tensors.
    layer1 = tf.keras.layers.Dense(2, input_shape=(2, ))
    layer2 = tf.keras.layers.Dense(2, input_shape=(2, ))
    fc1 = layer1(x)
    logits = layer2(fc1)

with tf.name_scope('loss'):
    xentropy = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits)
    loss_fn = tf.reduce_mean(xentropy)

with tf.name_scope('optimizer'):
    optimizer = tf.train.GradientDescentOptimizer(0.01)
    train_op = optimizer.minimize(loss_fn)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Evaluate the loss, then run one optimization step, feeding the placeholders.
    loss_val = sess.run(loss_fn, feed_dict={x: x_train, y: y_train})
    _ = sess.run(train_op, feed_dict={x: x_train, y: y_train})
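
The same idea works if you wrap the layers in a tf.keras.Sequential model and call it on the placeholder, which is essentially what the create_model(...)(X) line in the question does. A minimal sketch (layer sizes are arbitrary):

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 2])

# A Keras model is callable like a layer: calling it on a placeholder builds
# the graph, and its variables are picked up by global_variables_initializer().
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu'),
    tf.keras.layers.Dense(2),
])
logits = model(x)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out = sess.run(logits, feed_dict={x: [[-1.551, -1.469], [1.022, 1.664]]})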