Tags: opp ret iter logs read operation iterator log except
For a very large CSV file, we cannot read the whole thing into memory at once; we can only read it piece by piece, in chunks.
First, set up an iterator like this:
import pandas as pd
reader = pd.read_csv('data/servicelogs', iterator=True)
Then read in blocks, where each block is a chunk, and concatenate the chunks afterwards:
loop = True
chunkSize = 100000
chunks = []
while loop:
    try:
        # Pull the next chunkSize rows from the iterator
        chunk = reader.get_chunk(chunkSize)
        chunks.append(chunk)
    except StopIteration:
        # No rows left, so stop looping
        loop = False
        print("Iteration is stopped.")
df = pd.concat(chunks, ignore_index=True)
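The same result can also be obtained without the explicit while/try loop: passing chunksize= to pd.read_csv makes it return an iterator of DataFrames directly. Below is a minimal runnable sketch of this approach; the in-memory CSV built with io.StringIO is only a stand-in for a large file on disk (the path data/servicelogs is from the article, not used here).

```python
import io
import pandas as pd

# A small in-memory CSV standing in for a huge file on disk.
csv_data = io.StringIO(
    "id,value\n" + "\n".join(f"{i},{i * 2}" for i in range(10))
)

# chunksize makes read_csv yield DataFrames of at most that many rows,
# so the whole file never has to fit in memory at once.
chunks = []
for chunk in pd.read_csv(csv_data, chunksize=3):
    chunks.append(chunk)

# Reassemble the pieces; ignore_index gives a clean 0..n-1 index.
df = pd.concat(chunks, ignore_index=True)
print(len(df))  # 10 rows, read back in 4 chunks of up to 3 rows
```

In practice you would often process each chunk inside the loop (filter, aggregate) and only keep the reduced result, which is what makes this pattern memory-friendly.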
Original article: https://www.cnblogs.com/geeksongs/p/11072442.html