TensorBoard embeddings hang at "Computing PCA"

Question · 2 votes · 2 answers

I am trying to display my embeddings in TensorBoard. When I open the Embeddings tab, I get "Computing PCA..." and TensorBoard hangs indefinitely.

Before that, it does load a tensor of shape 200x128, and it also finds the metadata file.

I tried TF versions 0.12 and 1.1 with the same result.

import os
import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

# pre-allocate the embedding matrix and register it with the projector
features = np.zeros(shape=(num_batches*batch_size, 128), dtype=float)
embedding_var = tf.Variable(features, name='feature_embedding')
config = projector.ProjectorConfig()
embedding = config.embeddings.add()
embedding.tensor_name = 'feature_embedding'
metadata_path = os.path.join(self.log_dir, 'metadata.tsv')
embedding.metadata_path = metadata_path

with tf.Session(config=self.config) as sess:
  tf.global_variables_initializer().run()
  restorer = tf.train.Saver()
  restorer.restore(sess, self.pretrained_model_path)

  # write labels and collect features batch by batch
  with open(metadata_path, 'w') as f:

    for step in range(num_batches):
      batch_images, batch_labels = data.next()

      for label in batch_labels:
        f.write('%s\n' % label)

      feed_dict = {model.images: batch_images}
      features[step*batch_size : (step+1)*batch_size, :] = \
                  sess.run(model.features, feed_dict)

  sess.run(embedding_var.initializer)
  projector.visualize_embeddings(tf.summary.FileWriter(self.log_dir), config)
Tags: tensorflow, tensorboard
2 Answers

Answer 1 · 0 votes

I don't know what is wrong with the code above, but I rewrote it a different way (below), and it works. The difference is when and how embedding_var is initialized.

I also made a gist to copy-paste the code from.

import os
import numpy as np
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

# a numpy array for embeddings and a list for labels
features = np.zeros(shape=(num_batches*self.batch_size, 128), dtype=float)
labels   = []


# compute embeddings batch by batch
with tf.Session(config=self.config) as sess:
  tf.global_variables_initializer().run()
  restorer = tf.train.Saver()
  restorer.restore(sess, self.pretrained_model)

  for step in range(num_batches):
    batch_images, batch_labels = data.next()

    labels += batch_labels

    feed_dict = {model.images: batch_images}                     
    features[step*self.batch_size : (step+1)*self.batch_size, :] = \
                sess.run(model.features, feed_dict)


# write labels
metadata_path = os.path.join(self.log_dir, 'metadata.tsv')
with open(metadata_path, 'w') as f:
  for label in labels:
    f.write('%s\n' % label)


# write embeddings
with tf.Session(config=self.config) as sess:

  config = projector.ProjectorConfig()
  embedding = config.embeddings.add()
  embedding.tensor_name = 'feature_embedding'
  embedding.metadata_path = metadata_path

  embedding_var = tf.Variable(features, name='feature_embedding')
  sess.run(embedding_var.initializer)
  projector.visualize_embeddings(tf.summary.FileWriter(self.log_dir), config)                  

  saver = tf.train.Saver({"feature_embedding": embedding_var})
  saver.save(sess, os.path.join(self.log_dir, 'model_features'))
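
With the checkpoint, projector config, and metadata all written to the same log directory, the projector can then be opened from TensorBoard's Embeddings tab. A minimal launch command, assuming the path passed as self.log_dir:

tensorboard --logdir=/path/to/log_dir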

Answer 2 · 0 votes

This is a bug. It was fixed in TensorFlow 1.13.
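
If you are on an older version, upgrading past the fix should unblock the PCA computation. A minimal sketch, assuming a pip-managed installation:

pip install --upgrade "tensorflow>=1.13"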
