Hello,
A friend and I both upgraded to FiOS gigabit internet, and in both households we're getting different results that I can't nail down. All tests mentioned below are wired.
MY HOUSE
I have a new network from top to bottom: all Cat6, a Netgear X10 router, a server rack, two 24-port switches, and a lot of hardware running over the network. It smokes for speed on my Dell Latitude E6440 and MacBook Pro; I get 900/900 or better all the time.
However, on older laptops with supposedly gigabit cards (Lenovo X200 and MacBook4,1), I get wildly inconsistent results.
Lenovo X200: 180/297
MacBook: 197/898
The download on both is terrible, and the upload is oddly low on one and high on the other.
It seems to me that either the NICs in the older laptops just can't handle the speeds they advertise, or there is some setting I'm missing.
The MacBook is running CloudReady and the Lenovo is running Linux Mint MATE.
My E6440 is also running MATE, and the MacBook Pro is on Mojave.
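For the Linux machines, the first thing I know to confirm is what the link actually negotiated. A rough sketch (the interface name `eth0` is a placeholder; find the real one with `ip -br link`):

```shell
# Placeholder interface name -- substitute the real one from `ip -br link`.
IFACE=eth0
# Show what the NIC actually negotiated. A marginal cable, port, or NIC
# often shows up here as "Speed: 100Mb/s" even on a "gigabit" card.
ethtool "$IFACE" 2>/dev/null | grep -E 'Speed|Duplex|Auto-negotiation' \
  || echo "could not query $IFACE"
```

If it reports 1000Mb/s full duplex, the link itself negotiated fine and the bottleneck is elsewhere (CPU, driver, or the speed-test client).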
FRIEND'S HOUSE:
On his network my Dell E6440 gets 900/800, but his 9-year-old Dell tower only gets 125/215. The tower supposedly has a gigabit card, but it never gets anywhere close to gigabit speeds.
Yes, we've checked all the cables; everything in his house is Cat5e or better, mostly Cat6.
He’s running Windows 10.
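To take the FiOS line out of the equation entirely, a LAN-only iperf3 run between two machines would show whether the slow tower can even push gigabit locally (iperf3 runs on both Linux and Windows; the address below is a placeholder for the fast machine's LAN IP):

```shell
# Placeholder: the known-fast machine's LAN address.
SERVER=192.168.1.10
# On the known-fast machine, start a server:
#   iperf3 -s
# On the slow tower, test both directions (-R reverses, i.e. download):
#   iperf3 -c "$SERVER"
#   iperf3 -c "$SERVER" -R
echo "would test against iperf3 server at $SERVER"
```

If the LAN-only numbers match the speed-test numbers, the bottleneck is the tower itself (NIC, driver, or bus), not the internet connection.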
QUESTIONS
- Why would I get wildly different speeds on supposedly gigabit hardware? Is there a setting I'm missing, or is it just not possible on older "gigabit" hardware?
- Why would download be so consistently low while upload is considerably higher? This makes no sense to me.
Any thoughts on what we might check are appreciated.
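One more thing I've seen suggested for lopsided up/down numbers on old NICs: flaky hardware offloads (TSO/GRO/checksumming). They can be inspected, and toggled as an experiment, with ethtool (`eth0` is again a placeholder, and toggling is a guess worth trying, not a known fix):

```shell
IFACE=eth0   # placeholder interface name -- substitute your own
# List the current offload settings (tso, gro, gso, checksumming, ...)
ethtool -k "$IFACE" 2>/dev/null || echo "could not query $IFACE"
# To experiment, offloads can be switched off one at a time, e.g.:
#   sudo ethtool -K "$IFACE" tso off gro off
```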