Rampage VI Extreme (LGA2066, Intel X299) - info, experience, BIOSes etc.

FlanK3r
Level 13
Intel unveiled its public HEDT CPUs on the Intel X299 platform at Computex 2017. The CPU reviews are still under embargo for a short time ;-), but some information is officially out and cleared under Intel's NDA.

Intel X299 is a true high-end chipset for enthusiast CPUs, and this year the line-up is far more powerful than anyone hoped. Because AMD is also very strong on the CPU side this year (having announced not only 8 cores with SMT, but up to 16C with SMT for the AMD X399 platform), Intel will launch, step by step, 6C/12T, 8C/16T, 10C/20T... Everything? :) No, it continues with 12C/24T, 14C/28T, 16C/32T and a new flagship 18C/36T!!!


The new LGA2066 socket succeeds LGA2011-v3. A great advantage is that you can fit CPUs from two different generations. More info below.
1) Kaby Lake-X brings the new Core i5 X and Core i7 X CPUs. These are basically Kaby Lake with more capacitors and a bigger heatspreader, which could allow higher overclocking than classic Kaby Lake. Kaby Lake-X supports dual-channel memory only (up to 2666 MHz at 1.2 V). With XMP or manual tweaking you can expect anything between 3600 MHz and 4400 MHz effective DRAM frequency.
2) Skylake-X covers the HEDT Core i7 X and the new Core i9 X processors, starting at 6C/12T and going up to a crazy 18C/36T.
There is support for quad-channel memory. Based on the first results on the web, we can expect RAM overclocking of around 3200 to 3800 MHz. It all depends on the type of memory chips and the quality of the IMC in your particular CPU.

The APEX series replaces the Extreme series in the extreme-overclocking segment (yes, all fans of DICE, LN2 and LHe should focus directly on this board). This board broke many WRs after just the first day 🙂 Below is an example with information about the records as of 31.5.2017. The Rampage Extreme is for enthusiasts, watercooling setups, casemodders etc. The Strix series is the ideal part for daily overclocking (of course, it can handle LN2 too!) and for gamers, streamers...

Rampage VI Extreme - eATX size

-Looks awesome with the rainbow AURA effects! There is also a small display for current information about CPU clocks, temperatures or fan speeds...

The motherboard again supports up to 128 GB of DDR4 DRAM in up to quad-channel mode (depending on your CPU, KBL-X or SKL-X). In the upper-right corner are the helpful START and RESET buttons plus PCIe and DIMM switches, along with a slow-mode switch, retry and safe buttons, an RGB header, and the great ROG DIMM.2 slot for NVMe M.2 drives. So the Extreme can also be run under liquid nitrogen, if that is your hobby sometimes 😛 Look at the crazy number of voltage measuring points.

At the bottom are the button to switch between the two BIOSes, many USB ports, MEM OK, and another RGB header. Under the frontplate near the PCH there is a place for the next M.2.



-Part of the I/O. The I/O shield is integrated and, from left to right, there are a CLR CMOS button, a BIOS Flashback button, a Wi-Fi + BT module supporting the 802.11ad standard(!), many USB 3.0/3.1 ports, LAN, and audio outputs with backlight.


-Rampage in the glory 🙂

And finally, a video with a short description from GamersNexus:
Who knows me, knows me ;)....AMD 3000+, AMD x2 4600+ EE, AMD X4 955 BE C2,2x AMD X4 965 BE C3, AMD X4 970 BE C3, AMD x4 975 BE, AMD x4 980 BE, AMD X6 1090T BE, AMD x6 1100T BE, 2x AMD FX-8120, 2x AMD FX-8150, FX-6300, FX-8300, FX-8320E, FX-8320, FX-8350, FX-8370, FX-8370E, FX-9370, FX-9590, AMD A8-3850, AMD A8-3870K, A8-5600K, A10-5800K, A10-6800K, A10-7850K, A10-7870K, A 5150, Athlon x4 860K, Intel i7-5960X, i7-6700K, Intel i7-4770K, Intel i7-980x, Intel i7 2600k, Intel i7-3770K, i7-3930K.

DragonPurr wrote:
Yep, surgical hemostats have a multitude of uses in electronics repair. Another indispensable tool is a magnetic ratcheting screwdriver, and here in the U.S., the very best magnetic ratcheting screwdrivers are made by Snap-On Tools. I have both their shorter and their longer ratcheting screwdrivers. Their magnets are strong, they can use various 100-piece or 200-piece bit driver sets, and while they are not cheap, they last forever:

https://store.snapon.com/Standard-Handle-8-3-4-Ratcheting-Standard-Screwdriver-P634146.aspx
https://store.snapon.com/Standard-Handle-12-15-16-Ratcheting-Magnetic-Long-Orange-Screwdriver-P63414...

So here is a very bizarre, but very true, story about using surgical hemostats for PC repair. Yes, it's off-topic, but I think it's funny...

During my second year of full-time study at UT-Austin in 1983, IBM started interviewing on-campus for part-time positions as electronics technicians at the PC manufacturing plant that they were rapidly expanding at their big IBM campus in north Austin. I had purple-dyed hair and dragon tattoos at the time, and I knew that IBM had an ultra-conservative culture, but I applied anyway. I was offered an electronics technician job and started working at IBM 20 to 30 hours every week, and the pay was excellent. I did not receive IBM's full employee benefits such as medical insurance, but I was eligible for their very generous 50%-off employee discount to buy IBM PCs.

IBM used high-speed state-of-the-art robotics to assemble the mobos used on the original PC, PC/XT, and (later in 1984) PC/AT. That was followed by two technician stations where some components such as expansion slot sockets were hand-assembled, followed by a QC technician station that looked for mobo defects using high-quality magnifying lamps. The mobos were then passed to me at the final QA station before being sent to the wave-soldering machines. We had very high quality standards, there were never any production quotas, and each assembly and inspection station took as long as needed to ensure maximum mobo quality. At the same time that I was helping IBM build PCs as a UT-Austin student, Michael Dell was building PCs in his UT-Austin dorm room and selling them as his "PC's Limited" small business.

Anyhoo, a UT-Austin co-ed was seated at the QA station next to me. I noticed that she was using a straight hemostat to help with the repairing and adjusting of components, straightening bent IC chip pins, etc. She was actually using the hemostat as a roach clip for her, ummm, recreational herbs, and she brought it into work to help with fixing mobo components. So I asked my IBM manager if they could buy hemostats for every QC and QA technician to help with fixing and building PC mobos.

All the IBM managers dressed very conservatively, wearing heavily starched white shirts, and navy blue tie and slacks. But the UT-Austin students that they hired sometimes had tattoos and dyed hair. It was the heyday of the punk and new-wave '80s, after all. Many of us students were a very close-knit group at the IBM plant, and we would go club-hopping and listen to live music after work on Friday nights. On the following two school years, I was working part-time at IBM's development office at the same Austin campus, doing systems programming in C code and 8088 assembly language.

So that was how IBM supplied all their PC technicians with hemostats to assist with PC repair - all because a UT-Austin co-ed started using her hemostat roach clip at her QA station, and I asked my manager to buy hemostats for all of us to use. LOL!! True story!


I had never thought to use hemostats for pc building. I don't have the extensive pc history that many of you have, but I was going to the junkyard to pull heater cores out of wrecked cars back in the late 90's for watercooling and have been building ever since. I can't count the times those hemostats would have come in handy over the years. Or a nice set of screwdrivers. But wow, those are spendy! I'll have to look if you can get a nice set of bits in a good case. The one I bought at Lowes doesn't have a good case to hold the bits.

I missed what chip most of you plan to use in the board. Isn't the full line of i9's out next week? The 7920x from Silicon Lottery looks great, but the costs just get so high. Would be super cool to have 12 or more cores running above 4.5ghz. But I'll have to wait to see what the coffee lake/z370 boards are like before I decide. But sure enjoy watching this thread!:cool:

DashTrash wrote:
I had never thought to use hemostats for pc building. I don't have the extensive pc history that many of you have, but I was going to the junkyard to pull heater cores out of wrecked cars back in the late 90's for watercooling and have been building ever since. I can't count the times those hemostats would have come in handy over the years. Or a nice set of screwdrivers. But wow, those are spendy! I'll have to look if you can get a nice set of bits in a good case. The one I bought at Lowes doesn't have a good case to hold the bits.

I missed what chip most of you plan to use in the board. Isn't the full line of i9's out next week? The 7920x from Silicon Lottery looks great, but the costs just get so high. Would be super cool to have 12 or more cores running above 4.5ghz. But I'll have to wait to see what the coffee lake/z370 boards are like before I decide. But sure enjoy watching this thread!:cool:


Wow.. 12 cores all running at 4.7Ghz 🙂 I got my normal off the shelf 7900x on its way. I'll play with it in my rig and see how high it can go on a normal AIO before I decide to have it delidded or not. 🙂
Panteks Enthoo Elite / Asus x299 Rampage VI Extreme / Intel I9-7900X / Corsair Dominator RGB 3200MHz

MSI GTX 1080 TI / 2x Intel 900p / Samsung 970 Pro 512GB

Samsung 850 PRO 512GB / Western Digital Gold 8TB HD

Corsair AX 1200i / Corsair Platinum K95 / Asus Chakram

Acer XB321HK 4k, IPS, G-sync Monitor / Water Cooled / Asus G571JT Laptop

Brighttail wrote:
Wow.. 12 cores all running at 4.7Ghz 🙂 I got my normal off the shelf 7900x on its way. I'll play with it in my rig and see how high it can go on a normal AIO before I decide to have it delidded or not. 🙂


I'd love to see some posts on what you get with your chip! I think I'd end up going with the 7820X from Silicon Lottery. More cores would be cool, but I don't need them. The clock speed is more important for me. That's why I think I'll have to wait and see what the 8700K can do before I decide. I'll also be interested to see how much better the R6E is on average than competing boards.

DashTrash wrote:
I had never thought to use hemostats for pc building. I don't have the extensive pc history that many of you have, but I was going to the junkyard to pull heater cores out of wrecked cars back in the late 90's for watercooling and have been building ever since. I can't count the times those hemostats would have come in handy over the years. Or a nice set of screwdrivers. But wow, those are spendy! I'll have to look if you can get a nice set of bits in a good case. The one I bought at Lowes doesn't have a good case to hold the bits.

I missed what chip most of you plan to use in the board. Isn't the full line of i9's out next week? The 7920x from Silicon Lottery looks great, but the costs just get so high. Would be super cool to have 12 or more cores running above 4.5ghz. But I'll have to wait to see what the coffee lake/z370 boards are like before I decide. But sure enjoy watching this thread!:cool:


I plan to use two 7900X and one 7940X CPUs for three R6E workstation builds. I already have the two 7900X and am just waiting on the 7940X, and I pre-ordered three EK monoblocks. Back in June, I was thinking of doing two 7980XE workstation builds for 36 cores of processing power, but when I looked at the specs and considered my intended video and photo processing use, I will get far better CPU and I/O throughput with a 10+10+14 34-core 3-PC setup than with two 7980XE workstations. I look at it like a hotel with three washers and dryers. The 7940X will be the main washer/dryer. But when I have 2000 to 3000 high-res digital RAW photos and several dozen 4K videos to process, I use all three PCs connected through a KVM switch to load-balance and process in parallel. The 7980XE would give me bragging rights on benchmarks, but I am not using these on a test bench to run benchmarks every week. And just as you get rapidly diminishing returns if you use gaming software on more than 6 or 8 cores, the performance of all of the photo and video processing software out there tapers off as you go beyond 10 or 12 cores. And as I mentioned in a previous post, if you are mainly doing a gaming-only build, the 6-core 8700K will become the best gaming CPU.

When you compare the 12C 7920X, 14C 7940X, 16C 7960X, and 18C 7980XE, the 7940X hits the sweet spot for performance and value among the four HCC i9s. The 7940X has a higher base frequency and a higher Max Turbo frequency, and with a T-Junction temperature tolerance that is 8 degrees C higher than the 7980XE's, I think it will also have the highest overclocking headroom and the best single-core performance at its $1400 price, compared to $1700 and $2000 for the 16C and 18C:

https://ark.intel.com/compare/126240,126695,126697,126699

The "XE" in the 7980XE just gives you two more cores and a little more cache than the 7960X, but it has no unique features compared to the other i9 "X" CPUs.
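As a quick sanity check on that value argument, here is a tiny Python sketch computing price per core from the rounded prices quoted above (the ~$1200 figure for the 7920X is my own assumption, not a number from this thread):

```python
# Rounded prices (USD) and core counts; the 7920X price is an assumption.
cpus = {
    "7920X": (1200, 12),
    "7940X": (1400, 14),
    "7960X": (1700, 16),
    "7980XE": (2000, 18),
}

def price_per_core(price, cores):
    """Dollars per physical core, a crude value metric."""
    return price / cores

for name, (price, cores) in cpus.items():
    print(f"{name}: ${price_per_core(price, cores):.0f}/core")
```

By this crude metric the 12C and 14C both land around $100 per core, while the 16C and 18C creep up toward roughly $106 and $111 per core, which lines up with the sweet-spot argument above.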

The Snap-On Tools magnetic ratcheting screwdrivers are expensive, but I think they are worth it. Their strong magnet is housed in the screwdriver shaft, so they can use any standard hex-shaped bit driver set with non-magnetized bits and the bits become magnetized. Many of the cheaper magnetized screwdrivers do not use a strong magnet or have a flimsy ratcheting mechanism. I bought two shorter and two longer Snap-On ratcheting screwdrivers about 14 to 16 years ago, and they are still going strong while I use them for everything from building PCs, to building furniture, to working on my car engine or car interior, to doing household repairs. If you combine the Snap-On screwdriver with several diverse bit driver sets, you can modify PC cases, mobos, laptops, furniture, car engines, etc, etc.

Surgical hemostats are also extremely useful for PC building and other uses. Do what I do and get two straight hemostats and two curved hemostats. Sometimes I need to use all four hemostats at the same time. They can function like forceps or tweezers, and they can function like clamps. Straight and curved ceramic tweezers are also useful if you do lots of soldering. I use my ceramic tweezers for PC building, soldering of electronics, and soldering of stained glass projects.

I can thank my fellow UT student and IBM co-worker, Jennifer, for using her hemostat roach clip as part of her mobo repair work at IBM, which I then also used, and IBM bought them for all their PC techs, some of whom probably also used it for... herbal recreation. She carried her hemostat in her purse all the time. Another quick story about Jennifer, and AMD...

I originally had another person seated at the QA station next to me. One day, IBM moved some workers around, and Jennifer sat next to me. We sat at two of the six QA stations that provided the final detailed quality inspection before the mobos were put through wave-soldering.

It turns out that Jennifer was one of several hundred workers that AMD laid off throughout the mid-1980s. AMD had a 17-year policy of no layoffs prior to their first layoffs. To avoid publicizing that they were laying off employees, AMD referred to it as the "firing" of several hundred employees. AMD also had an office just down the street from IBM in Austin, and IBM hired some of the engineers and technicians that AMD "fired". In talking with the former AMD workers that IBM hired, many were very angry at AMD for saying that they were being fired just so AMD could appear to continue their no-layoffs policy. But most of the media coverage still referred to AMD's actions correctly as a mass-layoff. Being fired has a huge stigma usually associated with poor performance or behavioral issues. Being laid off is often just associated with companies cutting costs and restructuring. Intel has their anti-competitive tactics, but AMD is also no angel, and AMD has had lawsuits thrown at them over the past three decades for making various false promises to customers.

There were other pretty UT co-eds working in the IBM PC plant that I worked in, which was as big as a cavernous aircraft hangar, but I think Jennifer was one of the prettiest students that IBM hired, after AMD "fired" her. I told her that she looked like Rebecca DeMornay in the 1983 movie "Risky Business", not as a pick-up line since I knew she had a boyfriend and she knew I had a girlfriend and we would all go out for dinner and clubbing after work on Friday nights, but her face and physique really did look like DeMornay.
I was seated six feet from her QA station and she smelled nice too 😉 I joked that IBM should put a "Do Not Disturb" sign at her QA station because guys were constantly coming up to her table to talk. LOL

DragonPurr wrote:
...Back in June, I was thinking of doing two 7980XE workstation builds for 36 cores of processing power, but when I looked at the specs and considered my intended video and photo processing use, I will get far better CPU and I/O throughput with a 10+10+14 34-core 3-PC setup than with two 7980XE workstations...


Interesting. I plan on getting a single 7980XE for video and raw photo processing. What do you see bottlenecking an 18-core processor that would lead to better throughput with fewer cores across multiple machines? Is it disk I/O? Would 18 cores saturate dual 960 SSDs in your workflow?

AlexPeterson wrote:
Interesting. I plan on getting a single 7980XE for video and raw photo processing. What do you see bottlenecking an 18-core processor that would lead to better throughput with fewer cores across multiple machines? Is it disk I/O? Would 18 cores saturate dual 960 SSDs in your workflow?


Video and photo software would not bottleneck 18 cores. But none of the software out there, including Adobe's Premiere Pro and Photoshop (and Adobe's software tends to be better optimized for many cores than most other imaging packages), can make full use of 18 cores compared to processing the same files with 14, 12, or maybe even 10 cores. The same goes for all the other imaging software I use, including DxO OpticsPro with all the PRIME noise-reduction parameters turned on, Photomatix HDR processing, etc. The bulk of my I/O will happen on a 32 GB RAM disk on each of the three builds, which all have 64 GB of RAM. So I take 2000 or 3000 RAW photos from a Canon 5D Mark IV, along with 4K videos from a Canon C100, and I do "manual load balancing" by dividing the files across one, two, or three PCs, depending on how many files I need to process. That kind of parallel processing will run faster than one or two 18-core builds, again because the software cannot make good use of 36 threads. Maybe that will change five years from now, but I seriously doubt that all my various software packages will be that multi-thread-optimized even then.
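The "manual load balancing" step above, dividing a batch of RAW files across the three PCs in proportion to their core counts, could be sketched roughly like this in Python; the function name and the 10+10+14 split are just illustrations, not anything from the actual workflow described:

```python
def split_by_cores(files, core_counts):
    """Divide a list of files into batches sized proportionally to
    each machine's core count (e.g. [10, 10, 14] for three PCs)."""
    total = sum(core_counts)
    batches, start = [], 0
    for i, cores in enumerate(core_counts):
        if i == len(core_counts) - 1:
            size = len(files) - start  # last machine takes the remainder
        else:
            size = round(len(files) * cores / total)
        batches.append(files[start:start + size])
        start += size
    return batches

# Example: 2000 RAW photos split across 10C, 10C, and 14C machines
batches = split_by_cores([f"IMG_{i:04}.CR2" for i in range(2000)], [10, 10, 14])
print([len(b) for b in batches])  # [588, 588, 824]
```

The idea is simply that batch sizes track core counts, so each machine finishes at roughly the same time.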

Edited to add: Optimizing the utilization of 20, 28, or 36 threads requires careful software development practices. And as you add more threading, you often increase the possibility of race conditions, deadlocks, and livelocks, all of which can cause problems in the software. From a software development perspective, deadlocks are easier to debug if you can get a stack trace while the software is deadlocked. Livelocks are more difficult. And race conditions can be extremely difficult and time-consuming to debug and fix.

Two-thirds of my work experience has been in high-performance computing environments for scientific applications, including supercomputing. For example, in processing terabytes of seismic data, we have to create lots of threads to try to reduce processing times that can take several days of non-stop computing. But for consumer software, companies tend to look at the effort-to-reward ratio, and thus they mostly optimize for quad-core CPUs for gaming and for 8 to 10 cores for video and photo software, because if Adobe tries to optimize their software for 36 threads, the risk of adding more hard-to-find multi-threading bugs also increases.

So the 7980XE will still be faster than one 7900X, both on benchmarks and in actual applications. But with most consumer software out there, two 7900X CPUs doing parallel processing will be faster than one 7980XE. I am very sure that someone will build a 7980XE gaming-only PC with 128 GB of RAM for bragging rights. But then someone will come along and build a Z370/8700K gaming rig with 16 GB of RAM that outperforms the much more expensive 7980XE rig, assuming both builds use the same one or two GPUs, again because gaming software cannot make use of tons of threads and RAM.
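To make the race-condition point concrete, here is a minimal self-contained Python sketch (my own illustration, not from any of the packages mentioned): an unprotected read-modify-write on a shared counter is the classic race, and a lock serializes it. Whether the unsafe version actually loses updates depends on the interpreter and timing, which is exactly what makes these bugs so hard to reproduce.

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_add(n):
    global counter
    for _ in range(n):
        counter += 1  # read-modify-write is not atomic: threads can interleave

def safe_add(n):
    global counter
    for _ in range(n):
        with lock:  # the lock serializes the read-modify-write
            counter += 1

def run(worker, n_threads=4, n=25_000):
    """Reset the counter, run the worker on several threads, return the total."""
    global counter
    counter = 0
    threads = [threading.Thread(target=worker, args=(n,)) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print(run(safe_add))    # always 100000
print(run(unsafe_add))  # may be less than 100000 if updates are lost
```

The locked version is always correct but pays a serialization cost on every increment, which is a toy version of the trade-off that keeps vendors from chasing 36-thread scaling in consumer software.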

DragonPurr wrote:
...a secondary heatsink underneath the I/O cover is ENTIRELY for the benefit of distributing the VRM's heat. The 10G does not need to be heatpiped to another heatsink and it can easily cool with a smaller heatsink compared to that included under the I/O cover right now....


That's good to know. How would the heat from the 10G be dissipated from under the I/O cover? Is there airflow under the cover? Or is radiance from the cover and shield enough to cool the area?

AlexPeterson wrote:
That's good to know. How would the heat from the 10G be dissipated from under the I/O cover? Is there airflow under the cover? Or is radiance from the cover and shield enough to cool the area?


The I/O cover is not totally enclosed. Along the upper-right edge of the I/O cover, where the cover meets the mobo, there is a 2 1/4 inch long and 3/8 inch high opening where heat radiates out from the existing heatsink. The existing I/O cover heatsink has to release heat from both the 10G chip and from VRM heat that is heatpiped over to the I/O heatsink, which is more heat than just what the 10G chip emanates. So now with the EK monoblock handling the VRM heat, whatever heatsink is used to cool the 10G chip (hopefully provided by EK) has more than enough opportunity to radiate heat out through that I/O cover slot opening. There is no fan inside the R6E's I/O cover to provide active convective cooling, so even with the original I/O heatsink needing to passively release both the 10G chip and VRM heat, that slot opening provides enough upward dissipation of heat. And so the same slot opening is big enough to release heat just from the 10G chip alone.

If you look at the R6E photos:

https://www.asus.com/us/Motherboards/ROG-RAMPAGE-VI-EXTREME/gallery/

you can see the I/O cover's opening that starts from the I/O cover's upper-right corner (where the heatpipe enters the I/O cover) and extends down to the halfway point of the DIMM slots to the right of the I/O cover's heatsink opening. That is where the current I/O heatsink is releasing its heat from. The I/O cover's armor is plastic, so it does not help in releasing heat by radiating heat outward.

I will post some photos of my disassembled R6E later and it will be even more clear 🙂

red773
Level 7
I just got my Rampage today, and I noticed that two of the left RAM slots were open out of the box. Does this mean the board was manually tested before being packaged, or do all boards come like this?

red773 wrote:
I just got my Rampage today, and I noticed that two of the left RAM slots were open out of the box. Does this mean the board was manually tested before being packaged, or do all boards come like this?


I wouldn't worry about DIMM slots or PCIe slots being in their open position. That can happen just through regular handling. And Asus likely went back and modified all their previously-manufactured mobos to install the redesigned VRM heatsinks.