So, on these forums I have heard two versions:
1) Walt_K's version above: the magic figure is 1-9% of one card's CL, reported on that card, with the rest at 0.
2) daybreakgonesXe's and others': the magic figure is 1-9% of the TOTAL CL, on one card, the rest at 0.
Certainly 2 creates issues, as Walt_K pointed out, and I asked about that yesterday: if you keep increasing your total CL by apping, it gets harder and harder to meet that goal on one card.
However, 1 is strange as well. Presumably FICO is meant to measure something like how well I am doing with credit. If I have, say, $150K in CL and I keep a very old store card with a $100 CL, can I just put $1 on it to maximize my score? That doesn't seem to reflect the total credit picture (toy numbers sketched below).
So, is there a consensus?
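For what it's worth, here is a toy sketch of what the two versions imply numerically, using the $150K total / $100 store card figures above. The card names, individual limits, and the 5% target are my own made-up assumptions; FICO's actual formula is not public.

```python
# Toy comparison of the two "magic figure" interpretations above.
# All limits and card names are hypothetical; FICO's real formula is not public.
limits = {"store_card": 100, "visa": 75_000, "amex": 74_900}  # made-up limits
total_cl = sum(limits.values())  # 150,000

# Version 1: report 1-9% of ONE card's own limit on that card, rest at $0.
v1_balance = 0.05 * limits["store_card"]               # e.g. $5 on the $100 store card
v1_per_card_util = v1_balance / limits["store_card"]   # 5% on that card
v1_total_util = v1_balance / total_cl                  # ~0.003% overall

# Version 2: report 1-9% of the TOTAL CL on one card, rest at $0.
v2_balance = 0.05 * total_cl                           # $7,500
# On the $100 store card this is impossible (7,500% of its limit);
# it only works on a card whose limit is large enough:
v2_per_card_util = v2_balance / limits["visa"]         # 10% on the $75K visa

print(f"v1: {v1_per_card_util:.0%} on card, {v1_total_util:.3%} of total")
print(f"v2: {v2_per_card_util:.0%} on card, 5% of total")
```

Which is exactly the worry: under version 2, a growing total CL pushes a bigger and bigger balance onto a single card (here already 10% of the visa's limit), while under version 1 a $1 charge on a tiny old store card would apparently do the job.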
Thanks guys, I think I am understanding. Just play around with it and see what drops or raises my score. I am still too afraid to pay them all off completely.
Personally, I do not worry about getting to 9% reporting; I just make sure it reports below that amount. The best bet is to have only one card report a small balance (less than 9% of that card's credit limit).
Having 2 cards report will hurt your score. I had a second card report a $1.50 balance and it dinged my score 15 points, and I have well over $60K of open credit.
Do not use one card to try to hit 9% utilization of all your open credit limits. If you go above 90% of one card's limit, that card is considered maxed out, and that is another deduction to your score (typically a big one).
Do not let all cards report a 0 balance either. Your cards will report as paid OK, but all of them showing a 0 balance will reduce your FICO score as well.
I sometimes let only 5-10 dollars report on one of my cards. This seems to work just as well for my scores as reporting 1-9% of the limit.
Another post had a great point about utilization rounding up: $1 reporting on a $100K limit still counts as 1% utilization. (A rough sketch of all these rules follows below.)
***edited because I am a terrible writer.
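Rolled together, the rules in the post above amount to a few simple checks. Here is a minimal sketch of them; the 9% and 90% thresholds and the round-up-to-1% behavior are forum lore from this thread, not anything FICO has published, and the card names are hypothetical.

```python
import math

def check_reporting(balances, limits):
    """Sanity-check reported balances against the rules of thumb in this thread.
    balances/limits are dicts keyed by card name. Thresholds are forum lore,
    not anything FICO has published."""
    warnings = []
    reporting = [c for c, b in balances.items() if b > 0]

    if len(reporting) == 0:
        warnings.append("all cards report $0 - thread says this costs points")
    if len(reporting) > 1:
        warnings.append("more than one card reports a balance")

    for card in reporting:
        util = balances[card] / limits[card]
        if util > 0.90:
            warnings.append(f"{card} is over 90% of its limit (reads as maxed out)")
        elif util > 0.09:
            warnings.append(f"{card} reports above 9% of its own limit")
        # Per the round-up point above: any nonzero balance is at least 1% util.
        shown_util = max(1, math.ceil(util * 100))
        print(f"{card}: {shown_util}% utilization after rounding up")

    return warnings

# Example: $5 on one card, everything else at $0 - passes every check.
print(check_reporting({"visa": 5, "store": 0}, {"visa": 5000, "store": 100}))
```

Run as shown, the single small balance triggers no warnings and reports as 1% on that card, which matches the 5-10 dollar strategy described above.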
So even if you have a 0 balance on your card, it reports? I suppose that is what I am asking.
I have been playing around with my scores since I opened my first cc in August 2012. I pay most of them in full every month before the statement cuts; a few I let balances hit, mainly because I'm paying Christmas stuff off on those. Anyway, when I let a 0 balance hit, it has always shown OK for that month on my credit reports. I assume they mark it OK because I was charging stuff and paying it off, so there was activity. I'm sure it is different for other creditors, but you can see what I have in my signature line. I don't know about the Amex cards, though, because they are so new they have not hit my credit yet.
I don't think there will ever be consensus, but I also have a feeling that is because it is going to vary between people. The best bet if it is a life and death matter is to play around with it beforehand and find a sweet spot.
It seems to me that the exact situations described in this thread are the reason it is less formulaic across individuals than assumed. We know for a fact that FICO can bucket consumers based on certain characteristics, so is it completely ridiculous to hypothesize that utilization also depends on some sort of segmented calculation? For example, those whose available credit comes from 15 tradelines vs. those whose available credit comes from 1. This idea could be expanded into all sorts of categories, including the dollar amount, the type (cards vs. home equity vs. flex spending), etc. My thought, and I have no empirical evidence for this, is that my recent app spree probably changed my ideal utilization.
Maybe I'm very wrong, but if I'm not, it will probably blow all the assumptions on this forum out of the water. So let's hope I'm wrong.
I am buying a house in a few months. Will a balance of 0 stop recurring reports to my credit?
Comenity Bank, which holds a lot of store cards, will not generate a statement or report if you have a zero balance. Most others do report if there was usage, even with a zero balance.