How to fix "TypeError: cannot serialize '_io.BufferedReader' object" when attempting multiprocessing

Problem description (votes: 1, answers: 1)

I want to switch my code from threading to multiprocessing in order to measure its performance and hopefully achieve better brute-forcing potential, as my program is meant to brute-force a password-protected .zip file. But whenever I try to run the program I get this:

BruteZIP2.py -z "Generic ZIP.zip" -f  Worm.txt
Traceback (most recent call last):
  File "C:\Users\User\Documents\Jetbrains\PyCharm\BruteZIP\BruteZIP2.py", line 40, in <module>
    main(args.zip, args.file)
  File "C:\Users\User\Documents\Jetbrains\PyCharm\BruteZIP\BruteZIP2.py", line 34, in main
    p.start()
  File "C:\Users\User\AppData\Local\Programs\Python\Python37\lib\multiprocessing\process.py", line 112, in start
self._popen = self._Popen(self)
  File "C:\Users\User\AppData\Local\Programs\Python\Python37\lib\multiprocessing\context.py", line 223, in _Popen
return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Users\User\AppData\Local\Programs\Python\Python37\lib\multiprocessing\context.py", line 322, in _Popen
return Popen(process_obj)
  File "C:\Users\User\AppData\Local\Programs\Python\Python37\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
reduction.dump(process_obj, to_child)
  File "C:\Users\User\AppData\Local\Programs\Python\Python37\lib\multiprocessing\reduction.py", line 60, in dump
ForkingPickler(file, protocol).dump(obj)
TypeError: cannot serialize '_io.BufferedReader' object

I found threads that had the same issue as mine, but they were all unanswered/unsolved. I also tried inserting a Pool above p.start(), since I believed this was caused by the fact that I am on a Windows-based machine, but that did not help. My code is as follows:

  import argparse
  from multiprocessing import Process
  import zipfile

  parser = argparse.ArgumentParser(description="Unzips a password protected .zip by performing a brute-force attack using either a word list, password list or a dictionary.", usage="BruteZIP.py -z zip.zip -f file.txt")
  # Creates -z arg
  parser.add_argument("-z", "--zip", metavar="", required=True, help="Location and the name of the .zip file.")
  # Creates -f arg
  parser.add_argument("-f", "--file", metavar="", required=True, help="Location and the name of the word list/password list/dictionary.")
  args = parser.parse_args()


  def extract_zip(zip_file, password):
      try:
          zip_file.extractall(pwd=password)
          print(f"[+] Password for the .zip: {password.decode('utf-8')} \n")
      except:
          # If a password fails, it moves to the next password without notifying the user. If all passwords fail, it will print nothing in the command prompt.
          print(f"Incorrect password: {password.decode('utf-8')}")
          # pass


  def main(zip, file):
      if (zip == None) | (file == None):
          # If the args are not used, it displays how to use them to the user.
          print(parser.usage)
          exit(0)
      zip_file = zipfile.ZipFile(zip)
      # Opens the word list/password list/dictionary in "read binary" mode.
      txt_file = open(file, "rb")
      for line in txt_file:
          password = line.strip()
          p = Process(target=extract_zip, args=(zip_file, password))
          p.start()
          p.join()


  if __name__ == '__main__':
      # BruteZIP.py -z zip.zip -f file.txt.
      main(args.zip, args.file)

As I said before, I believe this is happening mainly because I am on a Windows-based machine right now. I shared my code with several others who are on Linux-based machines, and they were able to run the code above without any problems.
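A quick check of the start method (assuming Python 3.4+, where multiprocessing.get_start_method is available) shows the platform difference I suspect:

import multiprocessing

if __name__ == "__main__":
    # Prints "spawn" on Windows and typically "fork" on Linux; spawn
    # pickles the Process arguments, which is where the error comes from.
    print(multiprocessing.get_start_method())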

My main goal here is to get 8 processes/pool workers running to maximize the number of attempts made compared to threading, but since I cannot get a fix for the TypeError: cannot serialize '_io.BufferedReader' object message, I am unclear on what I am doing wrong here and how I can go about fixing it. Any assistance would be appreciated.

python python-3.x windows multiprocessing pool
1 Answer

3 votes

File handles don't serialize very well... but you can send the name of the zip file instead of the zip file handle (strings serialize fine between processes). And avoid zip as a variable name, since it is a builtin; I chose zip_filename:

p = Process(target=extract_zip, args=(zip_filename, password))

Then:

def extract_zip(zip_filename, password):
    try:
        zip_file = zipfile.ZipFile(zip_filename)
        zip_file.extractall(pwd=password)
        print(f"[+] Password for the .zip: {password.decode('utf-8')} \n")
    except Exception:
        print(f"Incorrect password: {password.decode('utf-8')}")

Another problem is that your code does not actually run in parallel, because of this:

      p.start()
      p.join()

p.join waits for the process to finish... which makes it nearly useless here. You could instead store the process handles and join them all at the end, as sketched below.
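A minimal sketch of that approach, reusing zip_filename and extract_zip from above (it still spawns one process per password, which the next point addresses):

processes = []
for line in txt_file:
    password = line.strip()
    p = Process(target=extract_zip, args=(zip_filename, password))
    p.start()                # start every worker without blocking
    processes.append(p)      # keep the handle instead of joining now
# Only join once everything has been started, so they truly run in parallel.
for p in processes:
    p.join()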

That can lead to another problem: creating too many processes in parallel can be an issue for your machine, and past a certain point it won't help much. Consider a multiprocessing.Pool instead, which limits the number of workers.

A simple example would be:

def f(x):
    return x * x  # any picklable worker function

with multiprocessing.Pool(5) as p:
    print(p.map(f, [1, 2, 3, 4, 5, 6, 7]))

Adapted to your example:

with multiprocessing.Pool(5) as p:
    p.starmap(extract_zip, [(zip_filename, line.strip()) for line in txt_file])

(starmap expands each tuple into two separate arguments to fit your extract_zip method, as explained in Python multiprocessing pool.map for multiple arguments.)
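Putting it all together, main could look roughly like this (a sketch under the assumptions above: extract_zip now takes the filename, parser comes from the question's code, and the pool size of 8 matches your stated goal):

import multiprocessing

def main(zip_filename, wordlist_filename):
    if zip_filename is None or wordlist_filename is None:
        # If the args are not used, display the usage to the user.
        print(parser.usage)
        exit(0)
    # Read the candidate passwords up front; only plain bytes cross the
    # process boundary, never a file handle.
    with open(wordlist_filename, "rb") as txt_file:
        passwords = [line.strip() for line in txt_file]
    with multiprocessing.Pool(8) as p:
        p.starmap(extract_zip, [(zip_filename, pw) for pw in passwords])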
