Everything posted by Hervé
-
I don't know the DW1397, only the DW1390 (which is supported OOB) and the DW1395 (which is supported by a patched IO80211Family or BCM43xxFamily kext). Have a look at your System Preferences hardware list to check whether the card is reported there. If it is, you'll see the PCI vendor id + device id, which can be looked up in the kext to ascertain hardware support. You could also open the laptop to look at the card itself and see what chip is on it and what's written on the label.
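To illustrate how that lookup works: a kext's Info.plist lists supported hardware in an IOPCIPrimaryMatch string of "0xDDDDVVVV" tokens (device id then vendor id). A minimal sketch, with purely illustrative ids standing in for whatever System Preferences actually reports:

```python
# Sketch: check whether a PCI device id appears in a kext's
# IOPCIPrimaryMatch string. The ids below are illustrative examples,
# not taken from any specific kext.
VENDOR_BROADCOM = 0x14E4  # Broadcom's PCI vendor id

def match_token(device_id: int, vendor_id: int) -> str:
    """Build the '0xDDDDVVVV' token used in IOPCIPrimaryMatch."""
    return f"0x{device_id:04X}{vendor_id:04X}"

def is_supported(device_id: int, vendor_id: int, io_pci_primary_match: str) -> bool:
    """Case-insensitive membership test against the match string's tokens."""
    tokens = io_pci_primary_match.upper().split()
    return match_token(device_id, vendor_id).upper() in tokens

# Hypothetical match string from a patched kext's Info.plist:
patched_match = "0x431E14E4 0x432814E4"
print(is_supported(0x431E, VENDOR_BROADCOM, patched_match))  # True
```

If your card's token isn't in the string, that's exactly what patching the kext adds.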
-
The GMA950 kext will need to be patched to take your Q965/Q963 device id into account, you can't just copy it to /E/E and expect graphics acceleration + resolution to work. By default, it only supports "regular" GMA950. The Intel framebuffer will probably require the same. I'll have a look and get back to you.
-
Do tell us what you did to fix your issues, that'll help others. Re: wifi, can you tell us what card you have?
- 84 replies; tagged with: M5-581TG-6666 M5, OSX Mountain Lion (and 8 more)
-
If you mean T8100, yes, I believe that'll fit.
-
GMA950 is the graphics controller of another chipset (GM945). According to the PCI database website, device id 0x2992 is the Q965/Q963 chipset, which integrates the graphics + memory controller. Q965 supports integrated graphics + a PCIe x16 discrete add-on graphics card; Q963 supports integrated graphics only. So look at your motherboard for a PCIe x16 expansion slot to ascertain whether you have the Q965 or just the Q963 chipset. If this chipset were unsupported by Mac OS X, your only hope would be to have the Q965, which would allow you to add a supported PCIe x16 graphics card. Should it be the Q963, you'd be screwed!

You can probably force max resolution through the Chameleon boot plist in /Extra: tick the Graphics Mode box and select the maximum resolution supported by your display. Try selecting GraphicsEnabler too (if that proves problematic at boot, you can manually use the boot option 'GraphicsEnabler=No' afterwards, then untick the option). If that works, there's a good chance we can find the correct framebuffer.

If the onboard Q965/Q963 video is indeed GMA3000, then I understand the GMA950 kexts would support it (GMA3000 is a direct evolution of GMA900/950). I reckon it will require patching the kext, but that should not be too complicated now that we have the PCI device id.
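For reference, those settings live in /Extra/org.chameleon.Boot.plist. A hypothetical fragment (1280x1024 is just a stand-in for whatever your display's actual maximum resolution is):

```xml
<key>Graphics Mode</key>
<string>1280x1024x32</string>
<key>GraphicsEnabler</key>
<string>Yes</string>
```

Chameleon Wizard edits the same keys via its checkboxes, so either route gets you there.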
-
Copy IOATAFamily kext to /E/E of your USB installer and re-run myFix. Do a quick search on the forum, this issue has been mentioned several times these last few days. I uploaded the kext too.
-
That file can normally only work on Intel GMA950 graphics controller. Are you using an add-on PCIe graphics card or the integrated GMA3000 controller? Try and give us the PCI vendor id/device id for your video controller: Apple Menu -> About this Mac -> More Info -> System Report -> Equipment/Hardware -> PCI and/or Video cards/monitors.
-
It is your CPU, you can take my word for it. If the point needed further proof: I just ran the Xbench Quartz test on my D630 T7500 GMA X3100 under ML 10.8.2. There's no GPU support, no graphics acceleration at all, yet the test scored 128! Just get a T7500 or better (T8xxx, or T9xxx with FSB800) and then you'll see a real difference. The T7100 just appears too low-spec for the D630 laptop.
-
I'd simply upgrade the CPU if I were you; that'll be a lot quicker and probably much cheaper to do... Get a faster C2D with a bigger cache. To prove my point:

a. D620, C2D T7200 2.0GHz FSB667, 2GB DDR2-667, 320GB HDD 7200rpm, nVidia NVS 110m with 1440x900 LCD, Lion 10.7.5
b. D630, C2D T7500 2.2GHz FSB800, 4GB DDR2-800, 80GB HDD 7200rpm, GMA X3100 with 1440x900 LCD, Lion 10.7.5
c. D630, C2D T7500 2.2GHz FSB800, 4GB DDR2-800, 160GB SSD, nVidia 135m with 1440x900 LCD, ML 10.8.2

Xbench Quartz Graphics tests on all 3, running PState on-demand, no app running and nothing open on the desktop:

a. 1) 89.94; 2) 90.29, a few minutes later; 3) 106.65, a few more minutes later; 4) 100.35, even more minutes later
b. 1) 81.78; 2) 65.43, a few minutes later; 3) 71.03, a few more minutes later; 4) 68.84, even more minutes later
c. 1) 43.38 (ok, I forgot to close uTorrent in the background); 2) 212.39, a few minutes later (uTorrent closed); 3) 87.35, a few more minutes later; 4) 74.97, even more minutes later; 5) 216.22, immediately afterwards

Then I ran Quartz + OpenGL tests simultaneously on all 3:

a. -> 64.09
b. -> 37.36
c. -> 71.47

So, go figure! These benchmarks don't mean much to me, I'm afraid... All three laptops run fine, with the D620 nVidia definitely a bit slower than the D630 X3100 under the same version of Lion. The D630 nVidia remains the best performer to me, although I don't really feel much of a difference between it and its X3100 counterpart; the graphics rendering is however much nicer, with a lot of that having to do with ML as well...
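The run-to-run spread above is large, so averaging the quoted scores gives a fairer picture. A quick sketch over the numbers from this post (machine "c" includes the uTorrent-skewed first run, hence its wide spread):

```python
# Average the Xbench Quartz runs quoted above, per machine.
from statistics import mean

quartz_runs = {
    "a (D620 NVS 110m)": [89.94, 90.29, 106.65, 100.35],
    "b (D630 X3100)":    [81.78, 65.43, 71.03, 68.84],
    "c (D630 NVS 135m)": [43.38, 212.39, 87.35, 74.97, 216.22],
}

averages = {name: round(mean(runs), 2) for name, runs in quartz_runs.items()}
print(averages)
```

Even averaged, the ranking is noisy, which is rather the point being made here.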
-
No, I have slightly higher values. But I have a different (better) CPU, and I don't believe Xbench actually evaluates the GPU; to me it only tests the CPU. I believe your feeling of lower performance is simply due to having the low-end CPU of the T7xxx family. Your RAM might be slower in the D630 too (timings, not just bus speed). Also, don't make the mistake of thinking that an nVidia GPU must simply be quicker/faster than the Intel integrated GMA X3100. It may be quicker, it may not be (I don't know the internal frequency of the nVidia 135m), but it definitely has more graphical capabilities than its Intel counterparts. That will be felt in graphics-intensive applications (games, video, CAD, all that kind of stuff), not really in basic/regular desktop windowing or browsing.
-
Yes, the boot flags and/or options that you manually specify at startup override the Chameleon boot plist. Watch out with very recent versions of Chameleon, though, as they appear to memorize manually entered boot flags and options for subsequent boots...
-
Oh, you need to get to know Intel CPUs a wee bit better... PState does show/display the correct set of frequencies for your T7100, make no mistake. T7xxx CPUs operate over a range from 600MHz up to 2.8GHz for the fastest of the range, in steps of 200MHz (i.e. steps of the CPU bus speed). It may not show up in Linux, but it certainly would in Windows. Look up the Intel datasheet if you need details or want to get familiar with IDA (a kind of turbo/boost mode) or the LFM/SuperLFM modes (low frequencies). Basically, a T7xxx runs at a variable FID multiplier x CPU bus speed, the lowest SuperLFM multiplier being x3 and the IDA multiplier being x12 -> this gives a range from 600MHz to 1.8-2.8GHz... That's EIST for you (Enhanced Intel SpeedStep). Now, sometimes the CPU FSB (which is the quad-pumped bus speed) does not operate at the exact max speed but slightly below it, so an 800MHz FSB (4 x 200MHz) CPU may actually run at 798MHz or 799MHz and, as such, report a frequency of, say, 595MHz instead of 600MHz or 2189MHz instead of 2200MHz. That's perfectly normal. Use tools like RMClock and CPUID in Windows and you'll see.
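The multiplier arithmetic above can be sketched in a few lines. A minimal illustration, assuming a 200MHz bus clock (800MHz quad-pumped FSB) and an illustrative multiplier range from x3 (SuperLFM) up to x14:

```python
# Sketch of EIST frequency steps: core speed = FID multiplier x bus speed.
# The x3..x14 multiplier range is illustrative; check the Intel datasheet
# for your exact CPU's supported FIDs.
BUS_MHZ = 200

def core_mhz(fid: int, bus_mhz: int = BUS_MHZ) -> int:
    return fid * bus_mhz

steps = {fid: core_mhz(fid) for fid in range(3, 15)}
print(steps[3])   # 600 -> the SuperLFM floor mentioned above
print(steps[11])  # 2200 -> e.g. a T7500's nominal top speed
print(steps[14])  # 2800 -> top of the range quoted above
```

With a real bus clock of 199.75MHz instead of 200MHz, the same multipliers yield the slightly-off figures (2189MHz and the like) described above.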
-
Mmm, I don't understand most of this, especially the part about the lowest frequencies "which don't really exist"... Did you check your BIOS settings and compare them to those listed in the dedicated thread? Also, can you post a screenshot of your Chameleon boot plist (opened with Chameleon Wizard) and tell us which SMBIOS settings you chose? You should also question the true meaning of benchmarking-tool results: you're running graphics tests, and the higher the CPU frequency, the better the results -> where's the GPU involvement in that? I don't know how the benchmarking is done, but it seems to me that only the CPU is tested and, should that be the case, it's not a true reflection of the computer's real graphics capabilities and/or performance. I'd suggest you run 2 separate GeekBench tests in the same way as well: first with on-demand performance, then with frequency set to highest. I'd be interested to hear which one is better... I guess that if you feel your laptop runs faster without Emulated SpeedStep, you should remove it.
-
Only the VGA port is supported, no DVI. Simply connect the monitor; it should be detected. If not, go to the Displays preference pane and click the "Detect Displays" button. Do not select mirroring: it does not work properly and you'd get both screens all garbled.
-
Ok, but that is not sustainable because that is safe mode. We need to identify the kext causing startup to fail and fix the issue.
-
Follow the EDP documentation to the letter. That should get you through without hiccup.
-
We should be able to get past that issue by finding the right Chameleon option or flag. I suggest trying various combinations of the following boot options: PciRoot=1, USBBusFix=No/Yes, npci=0x2000/0x3000, GraphicsEnabler=No/Yes. Use them along with the -v flag to monitor startup messages. Did you go through the various Optiplex 745 threads that were created last year?
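To work through those methodically rather than at random, you can enumerate every combination up front. A small sketch (the option values are exactly the ones listed above):

```python
# Enumerate the suggested boot-option combinations to try one by one
# at the Chameleon prompt, each with -v for verbose startup messages.
from itertools import product

options = {
    "PciRoot": ["1"],
    "USBBusFix": ["No", "Yes"],
    "npci": ["0x2000", "0x3000"],
    "GraphicsEnabler": ["No", "Yes"],
}

combos = [
    " ".join(f"{k}={v}" for k, v in zip(options, values)) + " -v"
    for values in product(*options.values())
]
print(len(combos))  # 8 combinations
print(combos[0])    # PciRoot=1 USBBusFix=No npci=0x2000 GraphicsEnabler=No -v
```

Eight boots is tedious but tractable; note which combination gets furthest in the -v output.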
-
Now that you mention it, one of the D630s that I very rarely use does indeed have a faint but regular HDD click. It's a Seagate 160GB 7200rpm unit. I have Hitachi 80GB drives in others, but I have not noticed anything that bothered me. I'm using an SSD in my daily Hack, so no noise at all...
-
??? I mustn't have the same system...
-
Please look up the compatibility chart + bootpack table in the EDP pages of the web site. They indicate what is supported; if it's not listed, it's not supported.
-
Oh, you really chose complication... Forget any decent graphics under ML on that model: there is no graphics support, there are no kexts at all, there is no acceleration, nada, zilch. No, you can't add an nVidia GPU; you can only buy/get a D630 fitted with one at the factory. No, there is no EDP support for ML on that model, which is why you can't run EDP on it under ML. Browsing is slow, video is unwatchable, opening/minimising/closing windows is slow, there are video artifacts most of the time (minor and sometimes major), games are unplayable, etc. Graphics painfully run off the CPU and, as such, affect overall system performance. To me, it's just not sustainable to run, unless you like to say you have ML on the system but don't actually use it; I don't see the point, but maybe that's just me... Again, I can only recommend sticking to Lion. I've uploaded the complete necessary Extra folder. Use it as indicated if you're a non-believer but, please, no more requests for ML on that D630 X3100 system. It's not supported and does not run properly, plain and simple.
-
Read my answer in the other topic re: the reported 8400M GS card -> replacing the DSDT with the version from the boot pack will restore the NVS 135m being reported. I have the exact same version/ROM/revision of the nVidia NVS 135m as you:
-
Yes, many systems do that with ML... -> https://osxlatitude.com/index.php?/topic/1852-1082-update/ You use DDR2-400? That'll slow things down... I didn't even think it would be accepted by the laptop. You really should use DDR2-667 or DDR2-800.
-
If your top menu bar is translucent, you have graphics acceleration as expected. Try a green/purple desktop background to verify this; it'll be more visible than with a blue or grey background. The DSDT installed after EDP has been run does indeed change the reported graphics card to 8400GS. If you reinstate the DSDT from the boot pack, you'll get the NVS 135m reported back. It's one of those little things we have to fix in EDP, but it's trivial as it's only cosmetic. Now, ML does run a bit slower than Lion and quite a bit slower than Snow Leopard on these machines. That's just how it goes; ML is much more demanding on resources. I'd say the minimum should be a T7500, with best performance from a T8xxx or T9300/T9500. DDR2-800 also helps. The Dock in SL was not transparent but acted like a mirror, showing a reflection of the icons and windows/desktop background above it. That effect was modified in ML, where it reflects icons a tad dimmer and windows above it only create a vague hint of a shadow. That's just OS X evolution...
-
Tjmax of both the T7100 and the T5600 is 100°C. You should therefore check the contents of the plist file of the IntelCPUmonitor kext in /Extra, as the default Tjmax value there is 0. You'll get incorrect (lower than real) temperature readings with an incorrect Tjmax value in the kext. I have nothing performance-specific in terms of kexts in my /E/E folder; it's just the EDP-installed kexts, including TSCsync. So, if you have a myHack/EDP installation, you have the same as I do. I just have better T7500 CPUs (bigger cache, higher frequency).
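To see why a wrong Tjmax matters: the digital thermal sensor reports a delta below Tjmax, and the monitoring kext computes the displayed temperature from it. A minimal sketch of that arithmetic (the delta value is illustrative):

```python
# Sketch: a DTS-based readout is "Tjmax minus sensor delta", so a bad
# Tjmax in the plist shifts every reading by the same amount.
def core_temp_c(tjmax_c: int, dts_delta_c: int) -> int:
    """Displayed core temperature from Tjmax and the DTS delta."""
    return tjmax_c - dts_delta_c

delta = 55  # hypothetical sensor delta below Tjmax
print(core_temp_c(100, delta))  # 45 -> plausible with the correct Tjmax of 100
print(core_temp_c(0, delta))    # -55 -> nonsense with the default Tjmax of 0
```

So if your readings look far too low (or negative), the Tjmax entry in the plist is the first thing to fix.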