How do I implement a task queue for a Python crawler?

Say my crawler needs to fetch the pages from http://xxx.com/page.php?id=1 through http://xxx.com/page.php?id=88, but because of machine limits I can only run ten threads. How do I distribute these 88 fetch tasks across ten threads and finish them as fast as possible? (P.S. I don't want to use a framework; I want to implement it natively myself.)

2 Answers

At least three approaches, each with a reference implementation. First, a thread pool via multiprocessing.dummy:

from multiprocessing.dummy import Pool as ThreadPool


def worker(n):
    return n + 2

numbers = range(100)

pool = ThreadPool(processes=10)
result = pool.map(worker, numbers)
pool.close()
pool.join()
print(result)

Second, a thread pool via concurrent.futures:

import concurrent.futures


def worker(n):
    return n + 2

numbers = range(100)

with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    result = executor.map(worker, numbers)
    print(list(result))

Third, a hand-rolled worker pool with queue.Queue and a sentinel value:

from collections import deque
import queue
import threading


def do_work(n):
    return n + 2


def worker():
    while True:
        item = q.get()
        if item is None:
            break
        result.append(do_work(item))
        q.task_done()

q = queue.Queue()
result = deque()
num_worker_threads = 10
threads = []
for i in range(num_worker_threads):
    t = threading.Thread(target=worker)
    t.start()
    threads.append(t)

for item in range(100):
    q.put(item)

# block until all tasks are done
q.join()

# stop workers
for i in range(num_worker_threads):
    q.put(None)
for t in threads:
    t.join()

print(result)

See the official documentation of these modules (multiprocessing, concurrent.futures, queue, threading) for details.
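Adapted to the question's actual task, the concurrent.futures pattern might look like the sketch below. It builds the 88 URLs from the question and maps them over a 10-thread pool; the fetch function is a stand-in (a real crawler would call something like urllib.request.urlopen there), so the sketch runs offline.

```python
import concurrent.futures

# Build the 88 task URLs from the question (id=1 .. id=88).
urls = [f"http://xxx.com/page.php?id={i}" for i in range(1, 89)]

def fetch(url):
    # Stand-in for the real download, e.g. urllib.request.urlopen(url).read();
    # here we just echo the URL so the example is runnable offline.
    return url

# At most 10 threads run at once; map() preserves the input order.
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as executor:
    results = list(executor.map(fetch, urls))

print(len(results))  # → 88
```

The pool takes care of the scheduling: each of the 10 worker threads pulls the next URL as soon as it finishes the previous one, which is exactly the "finish as fast as possible" behaviour asked for.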

There are several thread-synchronization options, e.g. a thread pool, a mutex, and so on.

Here is the rough idea of something I used before: the main thread declares a bounded queue whose maxsize is the thread limit, so for 10 threads that is Queue.Queue(10). The queue is passed to the child threads. Before a child thread starts working it inserts a value with Queue.put(1, timeout=30), and after finishing its work it removes one with Queue.get(). With the timeout set, a thread blocked on a full queue waits up to 30 s before trying the insert again.
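A minimal sketch of that gating idea, assuming one thread per task and a bounded queue used as a concurrency limiter (the actual page fetch is replaced by a placeholder computation, and a lock protects the shared results list):

```python
import queue
import threading

slots = queue.Queue(maxsize=10)   # at most 10 threads may work at once
results = []
lock = threading.Lock()

def crawl(task_id):
    slots.put(1, timeout=30)      # blocks (up to 30 s) until a slot frees up
    try:
        data = task_id + 2        # placeholder for the real page fetch
        with lock:
            results.append(data)
    finally:
        slots.get()               # release the slot when done

# One thread per task; the bounded queue caps real concurrency at 10.
threads = [threading.Thread(target=crawl, args=(i,)) for i in range(88)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # → 88
```

Note this spawns a thread per task and only limits how many run concurrently; threading.Semaphore(10) would express the same intent more directly, and the pool-based answers above avoid spawning 88 threads altogether.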
