Dear all (esp. Mike),
I ran gTest (https://github.com/goodmami/gtest) to do a coverage test against a large skeleton (item) file with 13 million lines/sentences in Indonesian. My computer worked very hard and became unresponsive (I could not do anything else with it).
With the help of a friend, I split this large file into 13,000 files (1,000 lines/sentences each) and used all the CPUs on my machine to run gTest in parallel, but again my computer became unresponsive (it might finish after several hours, but while waiting I cannot do anything else with my computer).
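Concretely, the split-and-parallelize approach I tried can be sketched like this (a rough sketch, not my exact script: `process_chunk` is a placeholder for the actual per-chunk gTest invocation, and capping the worker count below the CPU count is one way the machine might stay responsive):

```python
import itertools
import os
import tempfile
from multiprocessing import Pool

def iter_chunks(path, lines_per_chunk=1000):
    """Yield successive 1,000-line chunks from a large item file."""
    with open(path) as f:
        while True:
            chunk = list(itertools.islice(f, lines_per_chunk))
            if not chunk:
                return
            yield chunk

def process_chunk(chunk):
    """Placeholder for the real gTest call on one chunk."""
    return len(chunk)

if __name__ == "__main__":
    # demo input: 2,500 lines standing in for the 13-million-line item file
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
        tmp.writelines(f"sentence {i}\n" for i in range(2500))
        path = tmp.name
    # cap the worker count well below the CPU count so the machine stays usable
    with Pool(processes=2) as pool:
        sizes = pool.map(process_chunk, iter_chunks(path))
    os.remove(path)
    print(sizes)  # chunk sizes: [1000, 1000, 500]
```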
Is there a better way to do this?
Last time, when I ran gTest on the JATI skeleton (2,003 lines/sentences) without splitting the file or parallel processing, it took around 4 hours…