I don't know how loading times in games work (what affects them, etc.), but could Frostbite have something to do with it? I did all three stars in Arcade last night and I think the loading times were probably like 20-30% of my total time spent, heh.
Generally there are three phases to loading:
1) File I/O; this is the raw throughput of reading data out of storage into memory. It can be hurt by poor data layout contiguity, which increases seek times for the read head relative to the actual time spent reading data, by a poor choice of read buffer size, etc. It can also be hurt by background processes performing reads or writes during the load, which forces the OS scheduler to try to balance throughput between them.
2) Decompression throughput; this is the time spent on either a dedicated hardware unit or the CPU decompressing the data read in via the storage subsystem. Both consoles have hardware-accelerated zlib decompression, but some folks opt for higher-ratio software solutions like LZMA or Oodle Kraken, either in lieu of it or in addition to it. Disk space concerns are huge in this generation. Decompression time is usually smaller than I/O time, though, so the two pipeline well and the total isn't simply (1) + (2) added together (see the first sketch after this list).
3) Serialization; this is any work the process needs to do on the data to make it ready for use by the runtime/main loop. For most immutable assets like textures this is nothing, but for game objects it can involve unpacking structures that need to be memory-ready and fixing up pointers (the second sketch after this list shows a toy version), running load-time subsystems that might do some sort of additional spatial partitioning, priming the pump on AI-related state machines and pathing setups, etc. There's high variability here.
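
To make the pipelining point in (2) concrete, here's a minimal C++ sketch of a loader that reads chunks on one thread while another decompresses them, so the wall-clock time is roughly max(read time, decompress time) rather than the sum. The chunk size, archive name, and the DecompressChunk stand-in are all invented for the example; they're not from Frostbite or any console SDK.

```cpp
// Minimal sketch of the read + decompress pipeline from (1) and (2).
#include <condition_variable>
#include <cstddef>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

using Chunk = std::vector<unsigned char>;

// Tiny bounded-by-nothing producer/consumer queue; a real loader would cap it.
struct ChunkQueue {
    std::queue<Chunk> chunks;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;

    void push(Chunk c) {
        { std::lock_guard<std::mutex> lock(m); chunks.push(std::move(c)); }
        cv.notify_one();
    }
    bool pop(Chunk& out) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [&] { return !chunks.empty() || done; });
        if (chunks.empty()) return false;  // drained and producer finished
        out = std::move(chunks.front());
        chunks.pop();
        return true;
    }
    void finish() {
        { std::lock_guard<std::mutex> lock(m); done = true; }
        cv.notify_all();
    }
};

// Stand-in for zlib/Kraken/etc.; here it just copies the data through.
Chunk DecompressChunk(const Chunk& compressed) { return compressed; }

int main() {
    constexpr std::size_t kChunkSize = 1 << 20;  // 1 MiB read buffer (assumption)
    ChunkQueue queue;

    // Phase (1): the I/O thread streams raw chunks off storage.
    std::thread reader([&] {
        FILE* f = std::fopen("assets.pak", "rb");  // hypothetical archive
        if (f) {
            Chunk buf(kChunkSize);
            std::size_t n;
            while ((n = std::fread(buf.data(), 1, buf.size(), f)) > 0) {
                buf.resize(n);
                queue.push(buf);
                buf.resize(kChunkSize);
            }
            std::fclose(f);
        }
        queue.finish();
    });

    // Phase (2): decompress chunks as they arrive instead of waiting for the
    // whole file, which is why (1) and (2) largely overlap in time.
    std::vector<Chunk> decompressed;
    Chunk c;
    while (queue.pop(c)) decompressed.push_back(DecompressChunk(c));

    reader.join();
    std::printf("decompressed %zu chunks\n", decompressed.size());
    return 0;
}
```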
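
And a toy version of the pointer fixup part of (3): assets are commonly baked to disk with offsets instead of pointers, and the loader patches those into real addresses once the blob is in memory. The MeshHeader layout and field names are made up for illustration, not taken from any real engine.

```cpp
// Toy pointer-fixup example for the serialization phase.
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <vector>

struct MeshHeader {
    std::uint32_t vertexDataOffset;  // offset from blob start, written by the packer
    std::uint32_t vertexCount;
    const float*  vertexData;        // patched at load time, meaningless on disk
};

// The "serialization" work: turn the on-disk offset into a usable pointer.
void FixUpMesh(MeshHeader& mesh, const std::uint8_t* blobStart) {
    mesh.vertexData =
        reinterpret_cast<const float*>(blobStart + mesh.vertexDataOffset);
}

int main() {
    // Build a fake blob the way an offline packer might: header first,
    // vertex floats right after it.
    std::vector<std::uint8_t> blob(sizeof(MeshHeader) + 3 * sizeof(float));
    MeshHeader header{};
    header.vertexDataOffset = sizeof(MeshHeader);
    header.vertexCount = 3;
    std::memcpy(blob.data(), &header, sizeof(header));
    const float verts[3] = {1.0f, 2.0f, 3.0f};
    std::memcpy(blob.data() + sizeof(MeshHeader), verts, sizeof(verts));

    // Load time: reinterpret the blob in place and fix up the pointer.
    // This relies on the packer laying the data out with correct alignment,
    // which real engines take care of offline.
    MeshHeader* loaded = reinterpret_cast<MeshHeader*>(blob.data());
    FixUpMesh(*loaded, blob.data());
    for (std::uint32_t i = 0; i < loaded->vertexCount; ++i)
        std::printf("v[%u] = %f\n", static_cast<unsigned>(i), loaded->vertexData[i]);
    return 0;
}
```

For an immutable texture the fixup is basically nothing (copy bits into place and hand the pointer to the GPU), which is why the cost in (3) is dominated by game objects and the subsystems that touch them.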
If I had to guess based on my knowledge of DICE's engineering, I'd say some sort of just-in-time scene graph/visibility/etc. processing in (3) is what's problematic. But it's just a low-confidence guess.