FAQ Megathread (Got Questions? Look here before posting!)
Frequently Asked Questions

❔ Hacking makes your 3DS/2DS region-free: you can play EUR games on a USA system and vice versa. If a game won't load, try importing its seed first: FBI > Titles > select the game > Import seed. If that doesn't work, use Luma Locale Switcher to set the game's locale to match its own region (e.g. if the game is USA, set its region to USA) and try loading it again. Certain games, like Luigi's Mansion and Tomodachi Life, do require the right region to work. No idea why, and it's usually EUR systems trying to play USA versions of games. If you can't get games to work, please comment below for help.

❔ Whatever happened to Freeshop? / Nintendo killed it, along with every other program that pulled games from eShop servers. They are permanently dead.

❔ Can we make a new Freeshop? / No.

❔ Can I update to the latest system version? / Yes. Update Luma to the latest stable release and then update your system software.

❔ How do I hack my 3DS? / Follow the guide here, not a YouTube video: https://3ds.hacks.guide/

❔ I found a .CIA - now what? / Put it on your SD card and use an installer like FBI to install the game.

❔ Can I get banned for pirating games? / Yes and no. Bans have happened in the past, but it's been years since any. Nintendo does not know if your console is hacked, and they do not know if you are playing games you installed from a .CIA. Don't cheat while playing online and you should be fine. The 3DS is dead now anyway, so Nintendo seems to have given up.

❔ Can I update .CIA games normally? / Yes, if the game's region matches your console's (for example, a USA game on a USA system). If a game prompts for an update, try updating by normal means first. If you get an error, you can always download an update .CIA and install it through FBI.

❔ I installed DLC from a CIA file and went to the eShop, but now it's gone. What happened?
/ Some games, like Super Smash Bros., check your tickets every time you log into the eShop. If you haven't purchased the DLC legitimately, the ticket will be deleted. Redownload the DLC and it'll work again. You can also use a homebrew application called Faketik to restore it.

❔ If I have pirated games on my system, can I use the eShop? / Yes, everything works just as if you had never hacked your 3DS. You can buy and download games from the eShop.

❔ How can I install games if I don't have access to the SD card? / Use Boop (use version 1.4.0 if the game you're sending is 1GB or larger) or FTPD. Boop is PC-only, while FTP works from a PC or your phone. See the Boop guide and the FTP phone guides for Android and Apple. On PC, FTP is easy: just use an FTP client like FileZilla.

❔ If I have a pirated copy of Pokémon, can I use Pokébank? / Yes, but you do need to pay the yearly fee to use Pokébank, and no, you can't hack it to use it for free. You can also use PKSM on your 3DS or PKHeX on your PC as a free alternative.

❔ Can I upgrade my SD card? / Yes. All you need to do is format the new card to FAT32 and copy the entire contents of your old card to the new one. Everything should work like before, just with more free space. Cards up to 256GB seem to work fine. Cards of 128GB and larger should be formatted with 64KB clusters, or GBA VC injects may have display problems.

❔ Can I pop my SD card into a non-hacked console and have my games playable? / No, it doesn't work like that. Even another hacked console can't use the SD card from a different system. Nintendo tied the SD card's contents to a single 3DS, meaning it only works on the one system it's tied to.

❔ Can I do a system transfer to another system and still have my CIA games? / Yes, but the other console needs to be hacked first.
Hack the system you wish to transfer to first (I would advise backing up your saves with Checkpoint before doing the transfer, in case anything happens), then do the system transfer. Once done, use Faketik to restore your CIA-installed games.

❔ I have some .3ds files - how do I convert them to .cia? / To do it on your 3DS, use this guide. To do it on your PC, use this program.

❔ I have a 3DS game cartridge that I want to dump and install as a .cia - how do I do it? / Guide here.

❔ I want to back up or edit my GBA VC inject save - how do I get the save file? / How to back up and how to restore the save.

HOW TO PLAY DS GAMES

There are three ways to play DS games on your system. DS games cannot be packaged as a CIA file unless Nintendo released the game as DSiWare.
Get a flashcard, which offers the best compatibility. Ask in the comments here for flashcard recommendations and how to use them.
Use DS forwarders, which place shortcut icons on your home screen but don't AP-patch the ROM like Twilight Menu++ does. You'll need to AP-patch the ROMs yourself using this method.
Install Twilight Menu++, which is the best method if you don't want to buy a flashcard. Twilight Menu++ AP-patches your ROMs on the fly before launching them.
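The FTP install route mentioned in the FAQ above (FTPD on the console plus any FTP client) can also be scripted instead of using a GUI client. Here is a minimal sketch using Python's standard ftplib; the IP address, destination folder, and `send_cia` helper are my own illustrative assumptions, and 5000 is FTPD's usual default port (check the port shown on your console's screen):

```python
from ftplib import FTP

def send_cia(host: str, path: str, port: int = 5000) -> None:
    """Upload a .cia to the console running FTPD.

    `host` is the IP address FTPD displays on the console's screen.
    """
    ftp = FTP()
    ftp.connect(host, port)
    ftp.login()  # FTPD accepts anonymous login by default
    with open(path, "rb") as f:
        # "/cias/" is a hypothetical destination folder on the SD card
        ftp.storbinary("STOR /cias/" + path.rsplit("/", 1)[-1], f)
    ftp.quit()

# Usage, with your console's address: send_cia("192.168.1.50", "game.cia")
```

After the upload finishes, install the file on the console with FBI as usual.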
You've probably been hearing a lot about Bitcoin recently and are wondering what the big deal is. Most of your questions should be answered by the resources below, but if you have additional questions, feel free to ask them in the comments. It all started with the release of Satoshi Nakamoto's whitepaper; however, that will probably go over the head of most readers, so we recommend the following videos as a good starting point for understanding how Bitcoin works and a little about its long-term potential:
Limited Supply - There will only ever be 21,000,000 bitcoins created, and they are issued on a predictable schedule; you can view the inflation schedule here. Once they are all issued, Bitcoin will be truly deflationary. The halving countdown can be found here.
Open source - Bitcoin code is fully auditable. You can read the source code yourself here.
Accountable - The public ledger is transparent, all transactions are seen by everyone.
Decentralized - Bitcoin is globally distributed across thousands of nodes with no single point of failure, and as such can't be shut down, similar to how BitTorrent works. You can even run a node on a Raspberry Pi.
Censorship resistant - No one can prevent you from interacting with the bitcoin network and no one can censor, alter or block transactions that they disagree with, see Operation Chokepoint.
Push system - There are no chargebacks in bitcoin because only the person who owns the address where the bitcoins reside has the authority to move them.
Low fee scaling - On chain transaction fees depend on network demand and how much priority you wish to assign to the transaction. Most wallets calculate on chain fees automatically, but you can view current fees here and mempool activity here. On chain fees may rise occasionally due to network demand; however, instant micropayments that do not require confirmations are happening via the Lightning Network, a second-layer scaling solution currently rolling out on the Bitcoin mainnet.
Borderless - No country can stop it from going in/out, even in areas currently unserved by traditional banking as the ledger is globally distributed.
Portable - Bitcoins are digital, so they are easier to move than cash or gold. They can even be transported by simply memorizing a string of words for wallet recovery (while cool, this method is generally not recommended due to the potential for insecure key generation by inexperienced users; hardware wallets are the preferred method for new users due to their ease of use and additional security).
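The 21,000,000 cap mentioned above isn't an arbitrary constant: it falls out of the issuance schedule directly. The block subsidy started at 50 BTC and halves every 210,000 blocks, with sub-satoshi remainders truncated, which is why the true total lands just under 21 million. A quick check in Python:

```python
SATS_PER_BTC = 100_000_000
HALVING_INTERVAL = 210_000        # blocks between subsidy halvings

subsidy = 50 * SATS_PER_BTC       # initial block reward, in satoshis
total = 0
while subsidy > 0:
    total += subsidy * HALVING_INTERVAL
    subsidy //= 2                 # integer halving, like the protocol itself

print(total)                      # -> 2099999997690000 satoshis
print(total / SATS_PER_BTC)       # just under 21 million BTC
```

Using integer satoshi arithmetic (rather than floats) mirrors how the protocol actually computes rewards, which is why the sum comes out slightly below the 21 million headline figure.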
Bitcoin.org and BuyBitcoinWorldwide.com are helpful sites for beginners. You can buy or sell any amount of bitcoin (even just a few dollars worth) and there are several easy methods to purchase bitcoin with cash, credit card or bank transfer. Some of the more popular resources are below, also check out the bitcoinity exchange resources for a larger list of options for purchases.
Here is a listing of local ATMs. If you would like your paycheck automatically converted to bitcoin, use Bitwage. Note: Bitcoins are valued at whatever market price people are willing to pay for them, in a balancing act of supply vs demand. Unlike traditional markets, bitcoin markets operate 24 hours a day, 365 days a year. Preev is a useful site that shows how much various denominations of bitcoin are worth in different currencies. Alternatively, you can just Google "1 bitcoin in (your local currency)".
Securing your bitcoins
With bitcoin you can "Be your own bank" and personally secure your bitcoins OR you can use third party companies aka "Bitcoin banks" which will hold the bitcoins for you.
If you prefer to "Be your own bank" and have direct control over your coins without having to use a trusted third party, then you will need to create your own wallet and keep it secure. If you want easy and secure storage without having to learn computer security best practices, then a hardware wallet such as the Trezor, Ledger or ColdCard is recommended. Alternatively there are many software wallet options to choose from here depending on your use case.
If you prefer to let third-party "Bitcoin banks" manage your coins, try Gemini, but be aware that you may not be in control of your private keys, in which case you would have to ask permission to access your funds and would be exposed to third-party risk.
Note: For increased security, use Two Factor Authentication (2FA) everywhere it is offered, including email! 2FA requires a second confirmation code to access your account making it much harder for thieves to gain access. Google Authenticator and Authy are the two most popular 2FA services, download links are below. Make sure you create backups of your 2FA codes.
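Under the hood, Google Authenticator and Authy both implement the standard HOTP/TOTP algorithms (RFC 4226 / RFC 6238): the shared secret from the QR code is combined with a counter, or the current 30-second time window, via HMAC-SHA1. A minimal sketch using only Python's standard library (not the apps' actual source, just the published algorithm):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based OTP: HOTP over the current time window."""
    return hotp(secret, int(time.time()) // step, digits)

# RFC 4226 test vector: ASCII secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # -> 755224
```

This is also why backing up your 2FA codes matters: the secret itself is the credential, and losing it means losing the ability to generate codes.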
As mentioned above, Bitcoin is decentralized, which by definition means there is no official website, Twitter handle, spokesperson, or CEO. However, all money attracts thieves, and this combination unfortunately results in scammers operating under official-sounding names or pretending to be an authority on YouTube or social media. Many scammers throughout the years have claimed to be the inventor of Bitcoin. Websites like bitcoin(dot)com and the btc subreddit are active scams. Almost all altcoins (shitcoins) are marketed heavily with big promises but are really just designed to separate you from your bitcoin. So be careful: any resource, including all linked in this document, may in the future turn evil. Don't trust, verify. And as they say in our community, "Not your keys, not your coins".
Where can I spend bitcoins?
Check out Spendabit or the bitcoin directory for millions of merchant options. You can also spend bitcoin anywhere Visa is accepted with bitcoin debit cards such as the CashApp card. Some other useful sites are listed below.
Mining bitcoins can be a fun learning experience, but be aware that you will most likely operate at a loss. Newcomers are often advised to stay away from mining unless they are interested in it purely as a hobby, similar to Folding@home. If you want to learn more about mining, you can read more here. Still have mining questions? The crew at /BitcoinMining would be happy to help you out. If you want to contribute to the bitcoin network by hosting the blockchain and propagating transactions, you can run a full node using this setup guide. If you would prefer to keep it simple, there are several good options. You can view the global node distribution here.
Just like any other form of money, you can also earn bitcoins by being paid to do a job.
You can also earn bitcoins by participating as a market maker on JoinMarket, allowing users to perform CoinJoin transactions with your bitcoins for a small fee (this requires you to already have some bitcoins).
The following is a short list of ongoing projects that might be worth taking a look at if you are interested in current development in the bitcoin space.
One Bitcoin is quite large (hundreds of £/$/€) so people often deal in smaller units. The most common subunits are listed below:
One bitcoin is equal to 100 million satoshis.

- millibitcoin (mBTC) - 1,000 per bitcoin - used as the default unit in recent Electrum wallet releases
- microbitcoin (μBTC) - 1,000,000 per bitcoin - "bit" is the colloquial "slang" term for a microbitcoin
- satoshi (sat) - 100,000,000 per bitcoin - the smallest unit in bitcoin, named after the inventor
For example, assuming an arbitrary exchange rate of $10,000 for one bitcoin, a $10 meal would equal 0.001 BTC, 1 mBTC, 1,000 bits, or 100,000 satoshis.
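The same conversion can be worked through in code (using the arbitrary $10,000 rate assumed above):

```python
SATS_PER_BTC = 100_000_000
price_usd = 10_000            # the arbitrary exchange rate assumed above
meal_usd = 10

# integer satoshi arithmetic avoids floating-point rounding in the base unit
sats = meal_usd * SATS_PER_BTC // price_usd
print(sats)                   # -> 100000 satoshis
print(sats / 100)             # -> 1000.0 bits (microbitcoin)
print(sats / 100_000)         # -> 1.0 mBTC
print(sats / SATS_PER_BTC)    # -> 0.001 BTC
```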
For more information check out the Bitcoin units wiki. Still have questions? Feel free to ask in the comments below or stick around for our weekly Mentor Monday thread. If you decide to post a question in /Bitcoin, please use the search bar to see if it has been answered before, and remember to follow the community rules outlined on the sidebar to receive a better response. The mods are busy helping manage our community so please do not message them unless you notice problems with the functionality of the subreddit. Note: This is a community created FAQ. If you notice anything missing from the FAQ or that requires clarification you can edit it here and it will be included in the next revision pending approval. Welcome to the Bitcoin community and the new decentralized economy!
Let's begin. Lately I have been testing out different ways to clean up the PS3 XMB and have everything looking clean, concise, and where it should be. PS3 4K Pro does an amazing job of this, organising our XMB into specific categories, and it is even more helpful if we have webMAN installed. A genuine must-have for any CFW PS3 user.

STEP 1: DOWNLOAD & INSTALL 'PS3 4K PRO' TO YOUR PS3 CONSOLE:

After installation, your XMB should now look something like this: PS3 XMB AFTER PS3 4K PRO

Whilst cleaning up my XMB, I noticed that .ISO files (of games that you legally own and have ripped to your system) show up in a different place than a PSX Classic bought from the PSN Store and installed from a downloaded .PKG. This is just a reflection of how the PS3 system handles different types of files, and not at all a fault of PS3 4K Pro. Nonetheless, it is something we would ideally attend to for consistency. With a bit of effort, we can avoid having half our games in one location of the XMB and the other half in another. We do this by creating our own .PKGs instead of simply dumping the PSX disc's .ISO/.BIN file to the console. I generally prefer to convert all my physical PSX games to PKG format for my PS3, mainly for aesthetics, and you will see why shortly.

Now that PS3 4K Pro is installed and our XMB is looking nice, it's time to begin preparing our .PKG.

STEP 2: DOWNLOAD & INSTALL 'ImgBurn' ON YOUR PC:

For the sake of this guide, I will be using my copy of Crash Bandicoot [PAL] for PS1. Insert your PSX disc into your PC. When your PC reads the disc, the disc's VOLUME_NAME will be the PSX game's GAMEID. Take note of this GAMEID and create a folder on your desktop named GAMETITLE_GAMEID, e.g. CRASH_BANDICOOT_SCES-00344. Finalised CRASH_BANDICOOT_SCES-00344 folder.

Open ImgBurn and select 'Create image file from disc'.
In the 'Destination' field, navigate to the newly created GAMETITLE_GAMEID folder on your desktop. ImgBurn will automatically name the created .BIN and .CUE files 'GAMEID'.BIN and 'GAMEID'.CUE. Sweet, we've converted our physical PSX disc to a .BIN and a .CUE!

STEP 3: unRAR the 'psx_classics_new_2' file we downloaded and place the resulting folder somewhere memorable, such as My Documents:

Open 'PSX2PSP' (found in the 'PSX2PSP 1.4' folder). Open 'Convert Menu'. Click the three dots next to 'ISO/PBP File' and navigate to the .BIN we created earlier with ImgBurn. In 'Output PBP Folder', navigate to the working folder we have been using on the desktop (in my case, CRASH_BANDICOOT_SCES-00344). Click 'Convert'. Inside the working folder, PSX2PSP will create a folder named GAME_TITLE with an EBOOT.PBP inside it. This file is essentially the foundation of our .PKG conversion.

STEP 4: Copy the EBOOT.PBP file we generated with PSX2PSP into the root of the 'psx_classics_new_2' folder.

STEP 5: Drag the EBOOT.PBP file onto '_Fix_EBOOT.PBP_.exe':

This .EXE will unpack the EBOOT.PBP and place the required files into the PKG folder (by default, found in the same folder as '_Fix_EBOOT.PBP_.exe'). During the process you will notice an ISO.BIN.DAT and an ISO.BIN.EDAT file appear. Wait until both of these files disappear, along with the original EBOOT.PBP. It might look like the process has frozen (especially if you are packaging a big game!), but be patient; it's just taking a little time to process the files. Once all three files have disappeared, COPY (copy, don't move!) the PKG folder in 'psx_classics_new_2' to the working folder on the desktop we have been using throughout this tutorial (in my case, CRASH_BANDICOOT_SCES-00344). Rename this copied folder to the GAMEID we took note of earlier; in my case, I renamed my folder 'SCES00344'.
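The folder-naming convention in the steps above can be expressed mechanically, which helps avoid typos when you convert many discs. A throwaway sketch; the two helper functions are my own, not part of any of the tools used in this guide:

```python
def working_folder(title: str, game_id: str) -> str:
    # e.g. "Crash Bandicoot", "SCES-00344" -> "CRASH_BANDICOOT_SCES-00344"
    return title.upper().replace(" ", "_") + "_" + game_id

def pkg_folder(game_id: str) -> str:
    # the renamed PKG folder drops the hyphen: "SCES-00344" -> "SCES00344"
    return game_id.replace("-", "")

print(working_folder("Crash Bandicoot", "SCES-00344"))  # -> CRASH_BANDICOOT_SCES-00344
print(pkg_folder("SCES-00344"))                          # -> SCES00344
```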
OPTIONAL STEPS: CUSTOMISING THE XMB FEATURES OF THE PKG:

All PSX Classics PKGs follow the same folder structure: USRDIR, ICON0.PNG, PARAM.SFO, PIC1.PNG, and PS3LOGO.DAT. Optional additional files are PIC0.PNG (for PS3), PIC2.PNG (for PSP), and SND0.AT3. ICON0.PNG is the icon of the PKG that shows up on your XMB once it is installed. PIC0.PNG is the description of the game. PIC1.PNG is the XMB background shown when the installed .PKG is hovered over. SND0.AT3 is the background music that plays when the .PKG is hovered over.

CREATING AN ICON0.PNG IN PAINT.NET

First, let's make an icon (ICON0.PNG). Extract the contents of PS3 IconTemplates.zip into the 'psx_classics_new_2' folder; this archive contains the templates we will use. Open [PS1] ICON0.png in paint.net. Next, we must paste in a PSX cover. Here is one helpful site I found, with a nice collection of PSX covers. Download a cover for your game and open it in paint.net alongside the ICON0.PNG template. Highlight the image, minus the PlayStation vertical banner, and press CTRL+C to copy the contents of the canvas. Switch to ICON0.PNG in paint.net and press CTRL+V to paste the cover we copied. If it says 'The image being pasted is larger than the canvas size', just select 'Keep canvas size'. Manipulate the image so that it fits into the empty box of ICON0.PNG. When you are happy with your icon, select 'Save As', navigate once again to the working folder on the desktop, and name the newly created image ICON0.png. Just select 'OK' if any dialogue shows up. REMEMBER: THE FILE MUST BE SAVED AS A .PNG!

Finished ICON0.PNG

Cool, we have our own 'custom' icon. Want to make a background? Typically, the background (PIC1.PNG) for any PS1 Classic bought off the PSN Store looks like this: Default PIC1.PNG. If you'd like to use the standard background, just copy the [PS1] PIC1.PNG template to your working folder and rename it PIC1.PNG. Easy done!
However, if you'd like to customise the background a bit more, we can either source an image off the internet or delve into the PSX disc itself to find some image files. Success differs from game to game, and sometimes it really is just easier to find an image on Google. I'll offer a few tools you can use to delve further into the PSX disc if you would like to. This gets a bit messy and can be more trouble than it's worth; it borders on reverse engineering and is NOT necessary for creating the .PKG. But if this piques your interest, like it did mine, read on.

Without being an expert in PSX games, I have found that within the .BIN file there are files such as .HED, .WAD, .TIM, .BOB, .SPR, .XA, .NSF... the list goes on. What on earth are these files? It seems that some formats are entirely dependent on the developer that produced them, and some tools to unpack them are made on a per-game basis, e.g. CrashEdit for .NSF files. This led me to the conclusion that although finding original files on the PSX disc may be rewarding, it really depends on 1) which game it is and 2) how much time you are willing to sacrifice extracting and investigating files.

Some files found in PSX .BIN / .ISO images:

WAD - Container; needs a corresponding .HED to be opened.
TIM - Screen image data.
XA - Music playlist file.
STR - Stream (movie) file; usually the cinematics we see in the games. Sometimes handy for making SND0.AT3.
BOB - Bob Ray Tracer software; stores a bitmap image.
NSF - Literally a 'Naughty Dog Software File' XD. Mainly found in Crash Bandicoot games.

Tools for unpacking these files:

WADTool v1.0 - (http://www.mediafire.com/file/ccy63r49c268jdf/wadtoolv10.7z/file) - Usage notes: you will be asked to pick a WAD file only, which means the .HED should be located in the same directory.
hedwadtool - (https://github.com/DCxDemo/WAD-Tool/releases) - See above.
PSMPlay - (https://www.zophar.net/utilities/psxutil/psmplay.html) - For viewing .STR, .TIM, .MOV, .IKI & .XA files.
XnView - (https://www.xnview.com/en/xnview/#downloads) - For viewing .BOB files and converting them to .BMP format. If your .BIN has .BOB files, use this. *Download 'XnView Extended Setup'.

To extract the .NSF files in Crash Bandicoot games, download CrashEdit.

A BRIEF GUIDE ON UNPACKING PSX GAMES:

If I'm going to unpack a game, I like to create a folder called GAMENAME_UNPACKED to avoid confusion with files. Use CDMage to open the .BIN we created. CDMage will ask you what image type the .BIN is; select 'M2/2352 track'. Right-click 'Track 1' under 'Session 1' and select 'Extract files'. Point the extraction at the GAMENAME_UNPACKED folder, which should be inside the folder on the desktop we have been using throughout this guide.

From here, I used CrashEdit to analyse the .NSF files within the .BIN. The Crash Bandicoot PSX disc does have a .WAD in its root, but there was no corresponding .HED file, so I was unable to open it. Weird. As I said before, this process differs game to game. With CrashEdit I couldn't really find any images or audio that would work well on the PS3 XMB, so from here I'll either Google an image or use the default PIC1.PNG for my Crash Bandicoot .PKG. As Crash Bandicoot N. Sane Trilogy has been released for the PS4, I am going to use a promotional .JPG that is already 1920x1080 for my PIC1.PNG. Convert any 1920x1080 file to the correct format by simply opening the downloaded .JPG in paint.net and saving it as PIC1.PNG in our working folder.

Nonetheless, not all games are the same, so for the sake of the tutorial I will demonstrate a quick way to extract .BMP files from another game I own, Tony Hawk's Pro Skater. Use ImgBurn to create a .BIN as earlier in the tutorial.
Open the .BIN in CDMage, select the correct parameters, and extract its contents to a folder named something like THPS_UNPACKED. Create a new folder inside THPS_UNPACKED called WAD. Use WADTool v1.0 to extract the WAD produced by CDMage into THPS_UNPACKED -> WAD. Look for any files that may be of interest; menu backgrounds usually look great as XMB backgrounds. Here I found main_h.bmp, which will make a good background. Remember to open any file you find in paint.net, save it as a .PNG, rename it appropriately (PIC1.png), and save it to the working folder on the desktop. Cool, we have our own 'custom' background.

Want to make a description (PIC0.PNG)? In the PS3 Icon Templates archive we downloaded earlier there are two .TTF files: Play-Bold.ttf and Play-Regular.ttf. Right-click each of these and select 'Install', granting Administrator privileges if necessary. These are the official fonts used in the PIC0.PNG of any official PSN Store PKG download. To create your own PIC0.PNG, open 'PIC0.PNG' from the PS3 Icon Templates folder in paint.net and use the Play-Bold.ttf and Play-Regular.ttf fonts to accurately replicate the authentic font found on the XMB of any official PSN Store PKG download. I found font size 72 worked well for game titles, depending on how long they were. Experiment with the font size for your game descriptions to fit in as much text as you can.

If you'd like to use the authentic game descriptions, head over to https://psxdatacenter.com/. This website also provides a nice alternative to the PSX covers site mentioned earlier, with a great collection of cover art for PSX games. Just right-click any cover art you see on the site, select 'Save Image As', navigate to the working folder on the desktop, and save the .JPG.
You can use this file when creating your ICON0.PNG; just remember to export the final product as a .PNG! To find game covers at https://psxdatacenter.com/, find the GAMES LISTS button at the top of the page and navigate to NTSC-U, PAL, or NTSC-J. Within these directories are the games we are searching for; simply click the green 'INFO' button to the left of the desired game. On that page is a game description we can paste into our PIC0.PNG project in paint.net, as well as a PSX game cover. We will likely have to manipulate the text to make it fit the PIC0.PNG template; when it comes to text boxes in paint.net, the arrow keys and SHIFT+ENTER are your friends.

If you would prefer to use some prebuilt PIC0.PNGs, here is an archive of PIC0.PNGs for 138 PSX games not available on the PSN Store. Unpack the archive and browse the PIC0 folder. If there is a PIC0.PNG that corresponds to the game you are converting, copy the file to the working folder on the desktop and rename it PIC0.PNG. (HINT: you can find the game you are searching for by looking for the corresponding GAMEID in the PIC0 folder.) Cool, we have our own 'custom' description.

Want to make an XMB audio file? I like to use the native audio from the games, albeit the files are sometimes a bit hard to find. If you have an easily accessible song or audio file in mind, go ahead and prepare that particular file in .WAV format.

Extracting audio from .STR files: if your .BIN has .STR files inside it, these are a great way to extract the audio we will need for our menu music. Open any .STR file in PSMPlay to view it: when the PSMPlay interface loads, right-click the 'i' icon and select File -> Open Media File, then load your .STR (INTRO.STR might be a good place to start). You can use PSMPlay to preview the .STR files, although it does crash sometimes. If it ever does, just CTRL+ALT+DEL, end the process, and restart the program.
If you are happy to use the audio from the loaded .STR, right-click the 'i' icon again, this time selecting 'Output WAV'. Browse to the working folder on the desktop and name the file 'SND0_WAV.wav'. A dialogue box will appear in the top left; click 'Start' and wait for PSMPlay to do its thing.

Creating a .WAV from .PSF files: alternatively, if there are no .STR files in your .BIN, or if you don't want to reverse engineer a video file just to get an audio file (hey, this is supposed to be fun, remember? :P), Zophar's Music Domain has an awesome collection of .PSF (PlayStation audio) files. The files on this site are .RAR downloads, and they essentially contain the background music for all the levels of a game; they are really helpful for making menu music. .PSF files cannot be opened natively by Windows or macOS, so to utilise and convert them we need a piece of software called foobar2000, which in turn needs this .PSF decoder in order to read and convert .PSF files. Go ahead and download and install both of them.

Now, download a .PSF archive from Zophar's Music Domain and unRAR it with WinRAR. Select all the files in the directory and drag them into foobar2000 (once the .PSF decoder plugin has been installed). Now you can peruse the .PSF contents of the disc and choose appropriate audio for your .PKG. Once you have selected a .PSF file you are happy with, right-click it and select Convert -> Quick Convert. Make sure 'WAV' is selected, then click Convert, navigate to the working folder on the desktop, and select OK. foobar2000 will convert the .PSF to .WAV.

Now we must convert our .WAV to a file the PS3 can recognise when bundled into a .PKG: SND0.AT3. Extracting audio from a .STR on the CRASH_BANDICOOT disc seemed impossible, and even downloading the .PSF archive for this game did not give me the title music I really wanted for my SND0.AT3. I still ended up here.
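Before feeding the .WAV into the SND0.AT3 step, it can be worth sanity-checking what PSMPlay or foobar2000 actually produced (sample rate, channel count, length), since the AT3 guide depends on the sample rate. A small sketch using Python's built-in wave module; the filename below is just the one used earlier in this guide:

```python
import wave

def wav_info(path: str):
    """Return (sample_rate_hz, channels, duration_seconds) of a PCM .WAV."""
    with wave.open(path, "rb") as w:
        rate = w.getframerate()
        return rate, w.getnchannels(), w.getnframes() / rate

# e.g. wav_info("SND0_WAV.wav") might return (44100, 2, 93.5)
# for a 44.1 kHz stereo file about a minute and a half long
```

Note that this only reads uncompressed PCM .WAV files; it will raise an error on anything the wave module doesn't recognise, which itself is a useful signal that the conversion went wrong.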
You know, sometimes a good old YouTube-to-MP3 converter is the 'best' last resort. Many of them exist; it really shouldn't be too hard to find one. Do with it what you will, and use the tools I have provided to generate a .WAV file. (Manually converting to .WAV is not ABSOLUTELY necessary, as GoldWave can import .MP3 files; this is explained shortly.)

Generating a SND0.AT3 from our .WAV files: there is already a terrific guide on creating a SND0.AT3 file. Go ahead, make an .AT3, then come back here. The guide mentions looping the .AT3 with the GoldWave AT3 Looping Tool; I definitely recommend doing this extra step and making your .AT3 loop. Even if you did not generate an audio file from a .STR or .PSF file, you must still follow this guide to create your SND0.AT3. Once you have a SND0.AT3, place it in the working folder on the desktop.

Now we have all the files we need to make our .PKG! The working folder on the desktop we have been using. Here are our generated ICON0.PNG, PIC0.PNG, PIC1.PNG & SND0.AT3 files. Double-check, then triple-check, that all the files you're about to move are to your liking. Then, one by one, copy your created ICON0.PNG, PIC1.PNG, PIC0.PNG & SND0.AT3 to the PKG folder in the working directory we have been using on the desktop (the one we renamed to 'GAMEID' earlier in the tutorial; if you're lost, it's the highlighted folder in the screenshot above). Our directory should now look like this: USRDIR, ICON0.PNG, PARAM.SFO, PIC0.PNG, PIC1.PNG, PS3LOGO.DAT & SND0.AT3. Finalised 'GAMEID' folder. It is very important that this folder is named 'GAMEID'!

Dude, this has been so much effort. Is it even going to look good? For one last check of how our images will look on the XMB, we can use the RetroXMB_Creator.exe we downloaded earlier. Open RetroXMB_Creator.exe and navigate to the 'Preview' section.
From here, right-click the background, and options such as SET PIC0.PNG, SET PIC1.PNG, SET ICON0.PNG (right-click the PSX icon for this option) & SET SND0.AT3 appear. Unfortunately, we cannot preview the .AT3 files here. You can play them back through GoldWave, but remember that because we changed the sample rate in accordance with the SND0.AT3 guide, the audio file will sound slightly different to what you're used to hearing when played back on a PC. Do not worry: this difference is compensated for on the PS3 console, and your SND0.AT3 WILL sound 'normal'. Once you are happy with how your .PKG is going to look, proceed to STEP 6:

https://preview.redd.it/f3la9c5rqbp51.jpg?width=638&format=pjpg&auto=webp&s=053375857ba30a2666e8577fcc697187a517fd5e

STEP 6: Pack the contents of the generated folder (in this instance, 'SCES00344') into a PKG with TrueAncestor PKG Repacker:

Extract TrueAncestor PKGRepacker.zip to a memorable folder, such as My Documents. Once this is completed, copy the 'GAMEID' folder we created earlier to TrueAncestor_PKG_Repacker_v2.45 -> game. Run 'repacker.exe' with Administrator privileges and select option 2, Custom Pack PKG. TrueAncestor will then ask you for the .PKG type; press '5' for 'PSOne Classic'. The GAMEID folder we just pasted into TrueAncestor_PKG_Repacker_v2.45 -> game should show up as option 1 in the Game Folder List. Press 1, then hit ENTER. TrueAncestor will soon say '[\] Please follow this ContentID sample:'. Copy the EXACT code it provides and paste it where it says '[?] Please enter ContentID / A to abort:'. Press ENTER, and the process will progress. Entering the ContentID in TrueAncestor. TrueAncestor will then announce '[*] It will take a few minutes, please wait...'. Do so, and come back when it's done; it shouldn't take TOO long. TrueAncestor will place a completed .PKG file in TrueAncestor_PKG_Repacker_v2.45 -> pkg.
Copy this newly created .PKG (it will be something like 'TA9000-SCES00344_00-0000000000000000.pkg') and navigate to the 'Working Folder' we have been using on the desktop. Create a new folder inside it called PKG_FINALISED, then paste the .PKG here. STEP 7: Re-sign your newly created .PKG with PS3 Resigner Master Tool: Extract the PS3 Resigner Master Tool archive we downloaded earlier to a memorable location, such as My Documents. Copy the newly created .PKG we pasted in 'PKG_FINALISED' to the 'input -> pkgs' folder of PS3 Resigner Master Tool. Go back twice, and select 'resign_windows.bat'. Hopefully, you should be greeted with a message similar to this: After a while, the tool will finalise & sign your .PKG. You will know the file is completed when PS3 Resigner Master Tool reports something similar to: pkg signed! 1 file(s) moved. ps3xploit_resign: Output files: PKGS: .\output\pkgs\TA9000-SCES00344_00-0000000000000000.pkg_signed.pkg Press any key to continue . . . Navigate to PS3 Resigner Master Tool -> output -> pkgs. You will notice that '_signed' has been appended to the original title name. STEP 8: Copy the newly created (& signed) .PKG back to the 'PKG_FINALISED' folder we created earlier: For safety's sake, duplicate it once we have pasted it back. Name a copy GAMETITLE_GAMEID_SIGNED, e.g. CRASH_BANDICOOT_SCES-00344_SIGNED.PKG. That's it! We're done! Move the generated file we just signed over to your PS3 console via FTP or USB transfer. Place it in dev_hdd0 -> packages. Go to Package Manager, look for your .PKG and install it. https://preview.redd.it/uxc24658icp51.png?width=1920&format=png&auto=webp&s=d38393f987786c447696103bee9cc2d218ffddac Once installed, your PSX game will show up in the PSX Games folder on the XMB (Xross Media Bar). Have fun booting your PSX games right from your XMB! If you encounter any errors during the installation (or boot) of your .PKG, look back on the compression rates of the files we have imported and make sure they are correct.
By default, they should be. Also, ensure the .PKG was resigned properly (especially for HAN/HEN users). Converting PSX discs to .BIN / .ISO format can be very finicky sometimes. Double-check the ImgBurn website to confirm the correct transfer rates and parameters for PSX CD-ROMs. Other issues may be caused by PAL/NTSC differences or incorrect .BIN to EBOOT.PBP conversion. For those of you who already own a copy of Crash Bandicoot and would like to try this method out without having to first generate your own files or use the stock ones, I have uploaded the files that were created throughout this tutorial to a MEGA folder here. For those who'd like to see this process in action, I have also posted a video tutorial here. :) (https://www.youtube.com/watch?v=grobxkEmR2E)
Transcript of Real DeFi for Healthcare Update #2 "Asset Pools" from Solve.Care CEO Pradeep Goel (02.10.20) [Part 2 of 2]
Read Part 1 of this Transcript here: https://www.reddit.com/solvecare/comments/j8ykbb/transcript_of_real_defi_for_healthcare_update_2/ Watch this on YouTube: https://www.youtube.com/watch?v=rjsYThTQ6EY Now, in the big scheme of things, the asset pool is one of the several components of the DAO. The DAO, of course, is a community-run organization, so there is the community, under which there are the Governance Committee, community experts, and distributors. These roles we will discuss next week. https://preview.redd.it/pjzpjafb1es51.png?width=1366&format=png&auto=webp&s=93a4f3ab3cdfd8f17dab3b2824eb87e62b4c4d3a But consider: as the community that governs and owns and operates and participates in the DAO, you have asset pools which are funded by the community and governed by the Oracles and pool tokens. And then, last but not least, we have the Constitution, which defines how governance can and cannot happen, what governance schemes are permitted and how governance schemes are implemented, and how to update and manage them. And most importantly, the Constitution also tells us what the pools' purposes can be, what types of pools we will permit, and what types of pools the DAO should not permit. This is a DAO designed to advance healthcare around the world. So the Constitution calls for 'Do no harm,' 'Always take care of the patient's needs,' 'Never ever finance a product or service that can harm the patient,' 'Empower the physician,' and so on. So we will talk about the Constitution in the upcoming webinar. There is a lot to cover. The DAO has been thought through and explained in great detail in the Whitepaper as well. But I'm going to continue to run a series of webinars to introduce you to the DAO construct with the expectation that you, as a community member, as the ultimate governor of the DAO, will understand and ultimately manage and make this DAO a truly global solution for healthcare. And having proven it in healthcare,
I expect that others will copy or will leverage this DAO in many other sectors. https://preview.redd.it/3r1qd8ce1es51.png?width=1366&format=png&auto=webp&s=129b9df85a7b0c20c1d3ae9027aee1d7ef3f9fd5 So the asset pool life cycle—I've covered that a little bit earlier—is: (1) A pool proposal is submitted—it can be submitted by almost anyone. (2) An initial smart contract review tells us the proposal is complete and meets the constitutional requirements. And if the contract score is not high enough, or flags the need for committee review, then a Governance Committee or expert review occurs, where they must vote on it to push it forward. (3) The community voting is a step through which all the governance token holders vote on and approve the pool. They must approve the set of parameters in this vote. If the parameters are not acceptable, it goes back to Step One, because you cannot alter a single parameter. The pool is a package of parameters that the community accepts or declines, which is why a committee review can help the proposer optimize and refine the parameters before the community votes on them. Once a pool is voted into creation, the parameters are fed into the pool contract, which activates the pool. (4) The activation of the pool allows contribution of the assets, which then generates the PITx automatically. (5) The pool is used to distribute products through distributors. Distributors have a whole series of requirements and a smart contract for distribution that they must meet, including collateral, and so on. All payments are required to be fed back into the payment contract, which is monitored by the Oracle; the Oracle then calculates the daily value of the PITx, which allows the PITx holder to redeem the principal, interest, and income. So, this is a complete picture of an Asset Pool Lifecycle.
And what I didn't draw here, which you can also add on, is that at a certain point, either automatically because the pool parameter calls for automatic liquidation or by community vote, the liquidation contract is triggered, which then takes all the assets, distributes them to the PITx token holders, and closes out the PITx redemption contract. https://preview.redd.it/6rhb8ich1es51.png?width=1366&format=png&auto=webp&s=ad4b7fea37859c7af676a58ea37a5a5e3e481c98 Now, in all of this, there is a role for the SOLVE token that we have designed and that the DAO is using, because this is the most robust, simple, and effective model that we could come up with. The first is that when proposals are made for asset pool creation, or distributor proposals, there is a mechanism by which the proposal can be submitted to a smart contract, but it requires a certain amount of SOLVE tokens to be staked. And if there are too many proposals, some proposals can be boosted, and we put a limit of 10 boosted proposals at a time so the community is not overwhelmed with a tremendous amount of boosted proposals. The highest-boosted proposals will get the committee review first. Also, the governance tokens are only obtainable through mining the SOLVE token. So in other words, proof of stake, or staking SOLVE, gives people the governance token and the governance voting rights. The pool collateral can be, and by default is, in SOLVE tokens. And if the distributor does not have enough collateral, they have to deposit more before they can withdraw assets. And the pool payments, by and large, at least for those pools that are being distributed by Solve.Care to the market, such as the consumer device financing, all accrue back in SOLVE token, which then converts back to whichever coin is needed, stablecoin or not.
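The lifecycle just described (proposal, review, community vote, activation, distribution, liquidation) can be summarised as a minimal state machine. The sketch below is purely my own illustration of the transcript's steps; the state names and transition table are assumptions, not Solve.Care identifiers:

```python
from enum import Enum, auto

class PoolState(Enum):
    PROPOSED = auto()         # (1) pool proposal submitted
    REVIEW = auto()           # (2) smart-contract check; committee/expert review if flagged
    COMMUNITY_VOTE = auto()   # (3) governance token holders vote on the parameter package
    ACTIVE = auto()           # (4) pool activated; asset contributions generate PITx
    DISTRIBUTING = auto()     # (5) distributors deploy products; Oracle monitors payments
    LIQUIDATED = auto()       # liquidation pays out PITx holders and closes redemption

# Allowed transitions. A declined vote sends the proposal back to step one,
# because the parameter package is accepted or rejected as a whole.
TRANSITIONS = {
    PoolState.PROPOSED: {PoolState.REVIEW},
    PoolState.REVIEW: {PoolState.COMMUNITY_VOTE, PoolState.PROPOSED},
    PoolState.COMMUNITY_VOTE: {PoolState.ACTIVE, PoolState.PROPOSED},
    PoolState.ACTIVE: {PoolState.DISTRIBUTING},
    PoolState.DISTRIBUTING: {PoolState.LIQUIDATED},  # automatic or by community vote
}

def can_move(src: PoolState, dst: PoolState) -> bool:
    """Return True if the lifecycle permits moving from src to dst."""
    return dst in TRANSITIONS.get(src, set())
```

The key constraint the transcript stresses shows up in the table: from COMMUNITY_VOTE the only alternative to activation is a return to PROPOSED, since individual parameters cannot be edited after submission.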
So these four functions, proposal and boosting of proposals, mining for governance tokens, depositing pool collateral, and collecting pool payments, are currently proposed to be done in the SOLVE token. This is not to say that other tokens won't be accepted in the future. But this is the launching framework for the DAO. Now, a well-governed, properly constructed pool will not only pay interest, it will also generate income, [whether it] generates $1 in income or a million dollars in income, net of interest payments to the asset contributors. https://preview.redd.it/0ftp2b4n1es51.png?width=1366&format=png&auto=webp&s=13b25bccffc7c94d1949ced890dcef1fa1317bdf The proposal is for that income to be shared across these five functions. The bottom two functions are mandatory. There has to be a certain income attributed to the DAO Treasury. That Treasury then allows for the functioning of the DAO and for payments from the Treasury to community experts and for other purposes of the DAO. There has to be a certain allocation towards taxes and administration and paying the community experts, if there is such a need. And then there are the distributors, who actually distribute the pool products and ensure collection of payments. There are the asset pool contributors, who are putting the money in and who should get a piece of the income. And then there's certainly the Governance Committee, you, who also shares in the pool income. And the idea is that while the asset pool contributor has interest income, they should have an equal, or at least meaningful, stake in the income of the pool beyond the interest. And they should be just as engaged in the governance, promotion, and success of the asset pool beyond just collecting interest.
So the proposal is for approximately one third of the income to go to the governance community; a little bit less to the asset pool contributors, since they already have a meaningful APY attached in terms of interest; the distributor having a meaningful interest in the income, because they are the ones who are working with the end user—the doctor, the patient—and they're the ones responsible for timely collection of payments. Even though the collateral is at risk, we certainly want the distributor to have a meaningful interest in managing the risk. And then taxes/administration/community experts and the treasury contribution add up to 100%. These four parameters are subject to community governance, and you could change them. The community might vote to double the governance token holder component, and that would be well within the community's rights. As long as the pool can still perform successfully, it's the community's decision how much of the income goes to whom. https://preview.redd.it/wmwkf5js1es51.png?width=1366&format=png&auto=webp&s=f0c8e21b4b850b8909efae725486f169c2c2b047 So having said all that, there are lots and lots of possible asset pools, right? The notion here is that in this model of an asset pool—that is, community-governed, community-approved, community-funded, automatically measured, and easily demonstrated in terms of what is functioning and how it is performing—it makes sense to think about all kinds of different asset pools as possibilities, not necessarily to be done now, but certainly possibilities.
So the one clear pool where there is a lot of conversation, a lot of work being done, is around the idea of consumer medical device financing—essentially the ability for a doctor or physician to finance the necessary remote monitoring devices that they need from an ongoing care and chronic disease management perspective, and to have a monthly recurring payment for that device lease or financing model. There's a tremendous amount of work being done there. But there are other pools, which can fund professional device and facility financing, pools that can do cash-flow and receivables lending, pools that can underwrite practice insurance risk, pools that can provide consumer supplemental insurance for specific disease types. You could even fund drug research and discovery (those are very, very large initiatives), but as we see in the COVID-19 world, centralized healthcare doesn't work as well as it should have, so why don't we look at going beyond small pools to very large pools? It is going to be your job to figure out at what scale the DAO wants to offer these very advanced pool types. The point is that it's the community's purpose and function to decide the scale and size and purpose of the pool to launch. But this is just a way to get you started. This is a way for you to start thinking and start analyzing the pool types, and as an active governor of the community, it will be your function to choose which pools you approve and which pools you deny. https://preview.redd.it/7lvvnd1u1es51.png?width=1366&format=png&auto=webp&s=5ecbbf097dbe679eba7b3ca946dc4383d1a09fa7 Let's talk a little bit now about a hypothetical, illustrative pool that we have spent a lot of time building a model for. Now, this is not a 100% accurate dataset, because there's no way to be 100% accurate at this stage, but it is a projection. It's a template. It's a pro forma.
The notion here is to finance devices that cost, on average, a little bit less than a hundred dollars to buy and ship. Devices would be things like a Bluetooth-enabled blood pressure monitor, an oximeter, or a glucose meter, as examples. What we did is some research to figure out what the device cost would be: the cost per device bought in bulk, the shipping and handling costs, and the breakage and loss prevention insurance for the devices. We end up with about a $92 device. We then looked at the market and asked what the monthly lease revenue for this device would be, and it turns out to be around $15 per month, including certain capabilities to connect the device to software, and for the software to capture data from the device in a secure way and deliver it to the physician. In our [world], that would be the Global Telehealth Exchange. The device plus the Global Telehealth Exchange connection for the device yields a $15-a-month payment. Let's imagine that 10 percent of the device payments are not made. This is very high, because typically if the physician is leasing the device, we expect that to be a much lower number, but let's use this as a baseline. Because remember, we want to be able to have the Oracle measure the performance against a baseline, so let's baseline -10%, which means that the device would yield an annual revenue of $162, adjusted for defaults. And let's assume that the distribution costs of signing these leases, delivering the device, tracking the utilization of the device, and so on, are about 22% (this is also based on actual negotiations), so that leaves the pool a net revenue of $126 for every $92 lent. That would then get distributed as $11.09 of interest payments at 12% APY: $11.09 would be paid out by the pool to those who contributed their assets to the pool.
$92.40 is the return of principal, and that leaves the pool with a net income per device of $22.87, assuming that the default rate is 10%. If the default rate is 9%, $22.87 might become $23. The point is that the Oracle calculates this on a regular basis, so that one can say, 'against this baseline, how is the PITx(v) doing?' And that is the measure we are after. We're after an extremely accurate, community-governed, community-monitored measure of pool performance that does not require any more complexity than knowing whether the PITx(v) is above or below the base par value, if you will. https://preview.redd.it/tmfm5maw1es51.png?width=1366&format=png&auto=webp&s=2c8ae6bd7672710af5d6a3ff39000db5173513a7 We took that model and applied a half-a-million-device financing framework in our estimate (this is based on physician groups that we are talking to today), and what it ends up being is a $46 million pool. It would generate gross revenue of $81 million; pay out distribution costs of $17.8 million to those who will actually get these devices shipped and deployed at the patients' and physicians' offices; leaving the pool a net revenue of $63 million against a $46 million contribution. After returning the $46,200,000 of principal and paying $5.54 million of 12% APY to the contributors, the pool would have a net income of $11,436,000, which would then be shared across the parties. The asset contributors would get an additional $3.2 million, so their return goes from 12% APY to 12% plus $3.2 million, which certainly pushes them close to a 20% return. The distributors get $2.2 million for keeping the default rate at ten percent. If the default rate is at nine, they get an increased share, obviously. And the governance token holder community gets a $3.7 million distribution from that pool for being the good governors of it.
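The arithmetic above can be checked directly from the stated assumptions ($92.40 device cost, $15/month lease, 10% defaults, 22% distribution cut, 12% APY). This is a back-of-the-envelope sketch of my own, not Solve.Care's actual model:

```python
# Per-device assumptions quoted in the transcript
device_cost = 92.40        # cost per device incl. shipping/insurance (rounded to "$92" above)
monthly_lease = 15.00      # lease + Global Telehealth Exchange connection
default_rate = 0.10        # baseline share of missed payments
distribution_cut = 0.22    # distributor share of post-default revenue
apy = 0.12                 # interest paid to asset contributors

annual_revenue = monthly_lease * 12 * (1 - default_rate)   # $162.00, adjusted for defaults
net_revenue = annual_revenue * (1 - distribution_cut)      # $126.36
interest = device_cost * apy                               # $11.09 at 12% APY
net_income = net_revenue - interest - device_cost          # $22.87 net income per device

# Scaled to the 500,000-device pool
n = 500_000
pool_size = device_cost * n                                # $46.2M contributed
gross = annual_revenue * n                                 # $81M gross revenue
distribution = gross * distribution_cut                    # ~$17.8M to distributors
pool_net_income = gross - distribution - interest * n - pool_size  # $11,436,000

print(round(annual_revenue, 2), round(net_revenue, 2),
      round(interest, 2), round(net_income, 2), round(pool_net_income))
```

Every figure in the transcript falls out of these five inputs, which is exactly the point: the Oracle only needs the same handful of parameters to recompute the baseline and compare actual PITx(v) performance against it.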
Leaving $2.1 million for taxes, administration, community, and the treasury—all of this gets deposited in the DAO Treasury, from where all the legal, regulatory, and administrative costs are paid, and if those costs are lower, the remainder gets distributed to the other three parties. So this is an example of how an asset pool would operate in a pool governance model of community and the Oracle, and how you could take that model and apply it to all kinds of projects. And this is just a financing model. There's also an underwriting model along the same lines. But the point is that the community is free to use this as a template to think about how a pool should be configured and how it should perform. https://preview.redd.it/l219n9412es51.png?width=1366&format=png&auto=webp&s=51e7017d899f6f86be523d2c33b5decfbe1076fc Now, all that part is cool, but this is the best part. What I really like about the model that we have developed for asset pool governance is this: the PITx value reflects the pool's performance every single hour, or every single minute, or every day, depending on how frequently the Oracle updates the value. So on Day 0, the moment the first dollar is lent out, there is a certain payment schedule that creates an accurate value for the pool, so a dollar contributed to the PITx is worth a dollar. The moment the dollar is lent out, there is a certain PITx value increase. The lending of the dollar creates a payment schedule, which now has an anticipated income. As the weeks and months progress, the actual principal returns, payments, and defaults continue to show up as increases in the PITx value.
And if everything tracks according to the formula, at the end of the year the PITx(v) should be $1.16 in this model, where the assumptions are that we have basically funded three devices, we are paying out 12 percent interest from the pool, the income coming into the pool is $15 per device per month, the lease default rate is 10%, the distributors get 22% of the post-default revenue, and the expenses are 5%. In that model, you would see on a regular basis the accrued interest and earned income, or the pool income and the share of the PITx, and that one single value, PITx(v), in the right column is a simple indicator of how the pool is performing and whether it is meeting or exceeding expectations. https://preview.redd.it/7gk5kfv42es51.png?width=1366&format=png&auto=webp&s=bc41ec6fd528dfafe6823caaf98ba9dbe099427b So in a nutshell, we are focused on healthcare; we always will be at Solve.Care. And the DAO community for DeFi has enormous opportunities to finance healthcare projects. Whether they are proposed by Solve.Care, or by a distributor in Vietnam, or a community member in China, the DAO community will decide which projects to approve. So it's a huge sector, and there is ample opportunity for a very, very, very large and effective DAO to exist and to succeed, but that does not prohibit this DAO from functioning in other sectors as well. The model, fundamentally, is not sector-specific, but the Constitution requires that the DAO be focused on the best interest of the patient and the doctor. But if you, as a community, choose to allow other types of projects to come in, or to bifurcate and create a DAO for non-healthcare on the same model, that should not be terribly difficult. So with that said, I've given you what I hope is a complete picture of how asset pools are conceived, evaluated, approved, contributed to, and distributed; how payments are collected; how risk is measured; and ultimately, how they are redeemed.
The entire model is implemented through a series of schemes. Those schemes are a set of interlinked smart contracts with appropriate configuration parameters, this being the effort and the innovation of the Founding Committee, supported by Solve.Care and other members of the team, to build a truly scalable real-world DAO that the community can safely operate, govern, and expand. And I will share with you a tremendous amount of information about all the other elements of the DAO—the community governance model, the Constitution, and everything else that goes into ensuring that this DAO can operate and scale. But today, the focus was to get you to understand the asset pools, the Asset Pool Index token, and the process flow behind the life cycle of the asset pool, if you will. [I] welcome your thoughts and questions. I would love to hear back from the community on what we might do to improve the DAO, because ultimately it's your DAO, and this DAO will only come to life when the community governance is in effect. I want to make this point very clearly—the DAO design and implementation is one thing, but the actual operation of the DAO will only happen when the community is in charge of this DAO. We are not going to allow the DAO to operate unless the community has taken hold of it and owns it. So the phase we are in currently is the design, development, and improvement of the DAO framework, and coming up with a very powerful protocol that allows for merit-based project lending, real-life project lending, and real-time project monitoring. Once that protocol is finished, then it is your job to turn this DAO into a global phenomenon. I appreciate your time and efforts as always, and your support; both the SOLVE community and the SOLVE token holders, as well as the Solve.Care team, very much appreciate all your support over the years. And I look forward to talking to you again as we dive into the other elements of the DAO.
Have a great weekend, everyone! Watch this on YouTube: https://www.youtube.com/watch?v=rjsYThTQ6EY
This post is a master list of various digital decorating sites. Thanks to one of our members for contributing most of this information. If you are looking for Projection Tips and Advice, which references multiple other sites, please refer to our other post, the Projection/Projector Advice guide. This document will be reviewed/updated as needed. Updated: September 30, 2020 ________________________________________ Digital Effects / Projections / Videos: ________________________________________ Online $tores / Retailers: The following retailers offer holiday digital displays for projection. From current CGI to the more "classic" type of actor-style video & effects that pioneered the industry, they all offer something unique:
________________________________________ No malicious or illegal content will be shared in this post, and we do not condone or encourage any illegal activities in this post. ________________________________________ Conversion Tools / Software: < YouTube to Mp3 Online Converters > ytmp3.cc/en13/ | 320youtube.com/v5/ | offliberty.com/ < Software > 4kdownload.com/downloads
Will the Chicago Bears win OVER/UNDER 8.5 games? By University Stats Prof!
It was a roller-coaster ride for the Bears last year. They started with a 3-1 record before losing five of their next six meetings. They concluded the season by winning four of the last six games, but it wasn't enough to qualify for the playoffs. After an NFC North title in 2018, Da Bears ended with a disappointing 8-8 record last season. The offense was often criticized (deservedly so), and changes needed to be made.
2. Offensive Position-by-Position Breakdown
2.1 Quarterbacks (QBs) Mitchell Trubisky has had an unusual journey in the NFL thus far. After being selected as the number two overall pick, he had a rookie season where he threw 7 TD passes versus 7 picks. He took a nice leap in his sophomore year with 24 TDs and 12 interceptions, while leading the team to its first division title since 2010. QBs showing such nice growth from year 1 to year 2 rarely crash down the following season, but that pretty much describes Trubisky's third year in the league. He graded as the 30th-best QB in the NFL out of 37 qualifiers based on PFF rankings. This situation was inexplicable. It's not like the team had lost many key pieces on offense. What happened to Trubisky? GM Ryan Pace has nicely set up a good QB battle in camp between Trubisky and newly acquired Nick Foles. What's interesting is that Foles himself has had ups and downs in his career. He was outstanding in 2013, throwing 27 TDs versus just 2 interceptions! He also led the Eagles to a Super Bowl in the 2017 season, after Carson Wentz went down with an injury. Foles also performed well in 2018. However, he wasn't so good in 2014, 2015, and more recently 2019. What type of quarterback will he be in the Windy City? Who's going to get the starting nod? My own guess is that Foles wins the job early on. He is already familiar with the head coach, the QB coach, and the offensive coordinator. Learning the playbook won't be as difficult as if these guys had never worked together in the past. Backup QB Chase Daniel left for a division rival: the Detroit Lions. Overall, adding Foles over Daniel is clearly an upgrade over 2019, while also keeping in mind the fact that Trubisky may return to his previous form (which is not impossible for a young guy like him). 2.2 Running Backs (RBs) What the heck happened to Tarik Cohen? I have always liked small and fast guys. For this reason, he had become one of my favorite guys to watch. Watching him last year (and the entire offense) was sad.
His yards per rush average went from 4.5 to 3.3. His yards per catch average went from 10.2 to 5.8. He couldn't get going all season long. In 2017 and 2018, Jordan Howard and Tarik Cohen were a great version of the thunder-and-lightning combo. Despite losing Howard, the production wasn't supposed to drop significantly because of the acquisition of David Montgomery through the draft. That's not how things played out. The team went from 11th to 27th place in terms of rushing yards per game (from 2018 to 2019). Montgomery finished the year with a disappointing 3.7 yards per carry average. Both Montgomery and Cohen will be back in 2020. Perhaps they'll do better this year, but I don't expect a huge upgrade either. 2.3 Wide Receivers (WRs) Finally, a guy who has produced consistent results in this offense: Allen Robinson! Catching 98 balls for 1,147 yards and 7 TDs despite such bad QB play was phenomenal! You can count on him to generate good numbers again, especially in a contract year. A former second-round pick, Anthony Miller caught 52 passes last season after catching 33 the year before. The only blemish was the number of TD receptions, which went from 7 to 2. Miller started the year slowly following an offseason injury that made him miss some time in camp. His role could be increased after the departure of Taylor Gabriel. The Bears pulled the plug on the Taylor Gabriel experiment. After showing some flashes with the Falcons, he never lived up to expectations in Chicago. Again, the production from this group may be steady in 2020. 2.4 Tight Ends (TEs) I'm sorry, Bears fans, but one of the worst free agent acquisitions, in my humble opinion, was Jimmy Graham for two years and $16 million. The price paid versus the production doesn't make sense at all. If you look at his numbers, you can see a clear decline. His first seven seasons were a success; his lowest mark according to PFF during that time span was 74.7. Then, he received a 66.0 grade in 2017.
And then 59.6 in 2018, followed by 58.0 last year. To make matters worse, remember that the last two years were with the Packers, who happen to have a quarterback named Aaron Rodgers (have you heard of him?). Trey Burton was another huge disappointment last year. After catching 54 passes a couple of years ago, he only caught 14 in eight games. He was released and picked up by the Colts. The team drafted Cole Kmet in the second round of this year's draft. He's a classic tight end who can do a little bit of everything. He provides good run blocking, albeit sometimes a bit inconsistent. He doesn't have that much experience as a pass catcher, since he only started racking up decent stats last year, but he has a big catch radius. He will likely need time to develop into a solid starter. The Bears also have Adam Shaheen on their roster, a 2nd-round pick from the 2017 draft. He has bust written all over him. As if they didn't have enough tight ends, Chicago went on to sign Demetrius Harris, formerly of the Browns. He graded as the 66th-best tight end out of 66 qualifiers. Enough said. This group did very little last year. Six guys combined for just 46 catches. Despite the questionable moves, I expect a small upgrade. Perhaps Graham can magically rejuvenate his career? 2.5 Offensive Line (OL) Four out of five starters are returning: Cody Whitehair, James Daniels, Charles Leno and Bobby Massie. Only Daniels graded as above-average; the others finished in the middle of the pack (or even lower). Kyle Long announced his retirement, while semi-starter Cornelius Lucas left for Washington. The new starter on the OL will be Germain Ifedi, who made at least 13 starts in each of his first four seasons in the league (all with the Seahawks). In summary, we have a not-so-great starter being replaced by a not-so-great player. Therefore, we can expect similar results to 2019, which was average play.
2020 VS 2019 OFFENSE Inconsistency is a recurring theme for many players from this unit: Trubisky, Foles and Cohen. My final conclusion is a small upgrade over 2019, mainly because of the QB position. The chances are fairly good that either Foles provides a spark or Trubisky regains his 2018 form. However, don't expect an MVP-type season from either of them. The rest of the offense should produce similar output. Acquiring Jimmy Graham and Germain Ifedi is nothing to write home about, just as losing Taylor Gabriel isn't a big loss either. Final call (2020 vs 2019): Small upgrade
3. Defensive Position-by-Position Breakdown
3.1 Defensive Linemen (DLs) The interior defenders did a fairly good job. Roy Robertson-Harris, Nick Williams and Eddie Goldman all graded as above-average DLs in 2019. Only Bilal Nichols received poor grades, but he played less often. Nick Williams left for Detroit, but the Bears expect to get Akiem Hicks back in 2020. He suited up for just five games last year. He had been a dominating force for them the previous three years. His return to the field will make a big difference. So, despite Williams' departure, this group should do better in 2020 than the year before, mainly because of Hicks' return. 3.2 Defensive Ends (DEs) / Edge Rushers (ED) Khalil Mack's sack production went down in 2019 with "only" 8.5. He had recorded 12.5, 10.5, 11 and 15 in his previous four campaigns. Still, Mack finished as the #14 edge defender out of 107 guys. He is constantly disrupting plays from opposing offenses. The Bears lost Leonard Floyd, who went to the Rams, but they quickly found a replacement in Robert Quinn, coming over from Dallas. Floyd is two years younger and averaged 4.6 sacks per season, while Quinn has gotten 8.9 sacks per year over his nine-year career. Quinn is a better pass rusher, while Floyd plays the run better. All in all, I expect similar results to 2019 from this unit. 3.3 Linebackers (LBs) One more guy who saw a dip in productivity was Roquan Smith. After receiving a 67.0 grade in his rookie season, he only got a 52.4 last year. He played the run well, but his coverage and pass rushing weren't nearly as good in 2019. I do believe the former #8 overall pick can come back very strong in 2020. Danny Trevathan missed six games because of an injury, but he played pretty well when he was on the field. I am not worried about him. Backups Nick Kwiatkoski and Kevin Pierre-Louis both left in free agency. Both played very well while filling in for injured starters. Their losses deal a blow to Chicago's linebacker depth.
3.4 Cornerbacks (CBs) Kyle Fuller and Prince Amukamara were the clear starters in 2019. Despite finishing as PFF's #41 CB out of 112 qualifiers, Amukamara was released by the Bears for cap reasons. Still, the team needs to replace him. Can Buster Skrine or Kevin Toliver assume that #2 role? I'm not so sure about that… Chicago hopes to fill the void with Jaylon Johnson, selected in the 2nd round last April. The number one concern about him is health; he has undergone three shoulder surgeries over the years. Johnson's speed and explosiveness are below average, but he makes up for it with great competitiveness and smart play.

3.5 Safeties (S) We round out the defensive side of the ball with the safeties. Things were pretty simple in 2019, as both Ha Ha Clinton-Dix and Eddie Jackson played 99% of the defensive snaps. They ranked 19th and 46th out of 87 safeties, respectively, according to PFF. The problem is Clinton-Dix is gone to Dallas. Last year the Bears filled the vacancy created at the safety position when Adrian Amos left for Green Bay by acquiring Clinton-Dix, but now that he's also gone they have a glaring hole at the position.

2020 VS 2019 DEFENSE The Bears allowed the fourth-fewest points in the league last season. Can we expect a similarly good 2020 season? I doubt it. First, the good news: Akiem Hicks is back from an injury that made him miss 11 games, and the team acquired steady sack producer Robert Quinn from Dallas. The bad news? Losing DL Nick Williams, DE Leonard Floyd, LBs Nick Kwiatkoski and Kevin Pierre-Louis, CB Prince Amukamara and S Ha Ha Clinton-Dix. That's a lot of bodies to replace: at least four new starters and some key depth. Overall, my guess is the Bears' defense takes a moderate blow. The front seven is likely to remain very good, but the secondary worries me. I wouldn't fall off my chair if the team went from 4th-best in points allowed to the 10th-12th range.
Final call (2020 vs 2019): Moderate downgrade
4. Regular Season Wins
According to sportsbooks, the Chicago Bears are expected to win 8.5 games this season. Should we bet the “over” or the “under”? Here is the methodology I used in order to answer this vital question:
Use BetOnline.ag’s point spreads on all 256 regular season games.
Convert those point spreads into win probabilities.
Simulate each of the 256 games, according to those win probabilities, via the R statistical software.
Repeat the previous step one million times (you get 1M simulated seasons).
Count the proportion of seasons where the Bears won more or less than 8.5 games.
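The steps above can be sketched in a few lines. This is NOT the author's actual R code; it's a hedged Python sketch that assumes a standard model (NFL margins against the spread are roughly normal with a standard deviation around 13 points), and the spread list in the example is illustrative:

```python
import random
from statistics import NormalDist

SIGMA = 13.0  # assumed std. dev. of NFL scoring margin vs. the spread

def win_prob(spread: float) -> float:
    """P(team wins) given its point spread (negative = favored)."""
    return NormalDist(0.0, SIGMA).cdf(-spread)

def prob_over(spreads: list[float], line: float = 8.5,
              n_seasons: int = 100_000, seed: int = 1) -> float:
    """Share of simulated seasons with more than `line` wins."""
    rng = random.Random(seed)
    probs = [win_prob(s) for s in spreads]
    overs = sum(
        sum(rng.random() < p for p in probs) > line
        for _ in range(n_seasons)
    )
    return overs / n_seasons

# Illustrative 16-game slate (hypothetical spreads, not BetOnline's):
spreads = [-5, 0, -3.5, -1.5, 0, 2.5, -5, -1, 3, 1, -2, 4, -3, 0, 2, -6]
print(prob_over(spreads))
```

With one million iterations instead of 100,000, the estimated probability stabilizes to within a fraction of a percentage point.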
Here are the results:
OVER 8.5 WINS
UNDER 8.5 WINS
Tip: Bet UNDER 8.5 wins
Return On Investment (ROI): +9.7%
Rank: 26th-highest ROI out of 32 teams
Minimum odds required to bet (i.e. ROI = 0%): -163
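The "minimum odds required" figure follows from standard American-odds arithmetic. A hedged sketch (these are the usual conversion formulas, not the author's code):

```python
def implied_prob(american: float) -> float:
    """Break-even win probability implied by American odds."""
    if american < 0:
        return -american / (-american + 100)
    return 100 / (american + 100)

def roi(p_win: float, american: float) -> float:
    """Expected profit per unit staked, given an estimated win probability."""
    payout = (-100 / american) if american < 0 else (american / 100)
    return p_win * payout - (1 - p_win)

# Odds of -163 break even at roughly a 62% win probability:
print(round(implied_prob(-163), 4))  # 0.6198
```

In other words, the bet shows a positive ROI whenever your estimated probability of the outcome exceeds the probability implied by the posted odds.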
Here are BetOnline’s point spreads for the Bears’ 16 regular season games:
HOME: -5 vs DET, 0 vs GB, -3.5 vs HOU, -1.5 vs IND, 0 vs MIN, +2.5 vs NO, -5 vs NYG, -1 vs TB.
Note: The “Best odds” from the table above were obtained after looking at 13 well-known online sportsbooks on May 18th, 2020. TOMORROW: I'll talk about the team whose ROI is the 25th-highest in the league, the Green Bay Packers! I hope you found this article insightful, thanks for reading! Professor MJ
Will the Minnesota Vikings win OVER/UNDER 8.5 games? By University Stats Prof!
The 2019 season was a mixed success for the Vikings. They secured the #6 seed in the NFC and pulled off a huge overtime upset in New Orleans, but the offense completely stalled the following week in San Francisco. The franchise has not had a losing season in five years; head coach Mike Zimmer has done a really good job. Can the team take a leap forward and make it further into the playoffs? It has not reached the Super Bowl since 1976.
2. Offensive Position-by-Position Breakdown
2.1 Quarterbacks (QBs) Kirk Cousins has received more criticism than praise since signing a huge contract with the Vikings a couple of years ago. Yet, the team has posted an 18-13-1 record and Cousins has thrown 56 TD passes versus 16 interceptions. His completion rate has been excellent over those two years: 69.7% (among the best of his career). The criticism has concerned the lack of playoff wins more than the level of play itself. He cleared a hurdle by leading his team to a big playoff upset in New Orleans last season, thanks to a 4-yard TD pass to Kyle Rudolph in overtime. However, he followed it up with a horrific performance in San Francisco. Don't be misled by his 21-of-29 passes completed during the game; Minnesota flirted with the postseason record for fewest first downs in a game, managing only 7 while totaling 147 yards of offense. Still, based on PFF grades, 2019 was Cousins' best career season. He ranked as the #6 QB in the league with an 84.1 mark. Sean Mannion will once again back up Cousins. He's clearly not among the better #2 quarterbacks in the league. Cousins has been extremely durable throughout his career, and the Vikes hope it stays that way.

2.2 Running Backs (RBs) In my 2019 NFC North preview, I mentioned how I believed Dalvin Cook was one of the most underrated players in the league. He had only rushed for 354 and 615 yards in his first two seasons, but he had passed my eye test. I knew that, barring injuries, he would break out as one of the top backs in the league. He did enjoy a nice 2019 season with 1,135 rushing yards and 519 receiving yards, while racking up 13 touchdowns. Two things raise some concerns about him, though. First, his lengthy injury history; he seems to get nicked up often. Second, his play tailed off quite a bit towards the end of the season. During the first eight games of the season, he rushed 156 times for 823 yards, a lofty 5.3 average.
However, over his final six games (including the playoffs) he carried the ball 84 times for 256 yards, a meager 3.0 average. After being selected in the 3rd round of the draft out of Boise State, Alexander Mattison showed promise in his first year as a pro. He had 100 rushing attempts for 462 yards, a nice 4.6 yards-per-carry average. It will be interesting to see if he can carry the load if Cook goes down.

2.3 Wide Receivers (WRs) The Vikings had one of the most talented WR duos in Adam Thielen and Stefon Diggs. They caught 113 and 102 passes, respectively, during the 2018 season. Those figures regressed to 30 and 63 last year. Thielen only played 10 games, but he was still on pace for just 48 receptions. What was the problem? In 2018, Minnesota had the 6th-highest number of passing attempts; in 2019, that rank dropped to 30th! That being said, the team traded Diggs to Buffalo. He had expressed frustration with Cousins, and the two didn't seem to be on the same page. To compensate for that loss, GM Rick Spielman signed Tajae Sharpe, formerly of the Titans. He will fight for the #2 role opposite Thielen. The former fifth-rounder posted decent numbers in his first three years in the league. He used to be a starter, but his playing time got cut after Tennessee drafted A.J. Brown and signed Adam Humphries. Sharpe seems destined to be a #2 or #3 receiver in the NFL. The team also has high hopes for first-round rookie Justin Jefferson. He was very productive at LSU and ranked second in 15+ yard receptions over the last two seasons (only Jerry Jeudy beat him). He wasn't spectacular as an outside target, but he had a monster season playing in the slot last year. He's great with contested catches and has a good shot to become an immediate starter. Bisi Johnson took advantage of Chad Beebe's injury to grab the number three role last year. The 7th-round rookie posted a 31-294-3 stat line, which was "okay", but he seems like a long shot to become a true starter.
The team finally pulled the plug on the failed Laquon Treadwell experiment. He's been nothing short of a disappointment since being the #23 overall pick in the 2016 draft. He signed a contract with the Falcons in the offseason.

2.4 Tight Ends (TEs) The Kyle Rudolph – Irv Smith combo is very solid. Both players appeared in all 16 games, with Rudolph recording slightly better numbers. He hauled in 39 passes for 367 yards and 6 TDs, while Smith's numbers were 36-311-2. Rudolph made some highlight-reel catches, the most important being the game-winning TD catch in overtime in New Orleans. Smith is expected to expand his role in the offense with one year of experience under his belt and Diggs off the team. He showed very nice potential despite the Vikings relying so often on the running game.

2.5 Offensive Line (OL) This unit allowed the sixth-fewest sacks in the league, but that wasn't necessarily a great accomplishment given how often the offense ran the ball. Overall, this is an average, or slightly above-average, offensive line. Here is a rundown of each starter's PFF rankings:
Garrett Bradbury, 30th out of 37 centers;
Brian O’Neill, 33rd out of 81 tackles;
Riley Reiff, 38th out of 81 tackles;
Pat Elflein, 39th out of 81 guards;
Josh Kline, 26th out of 81 guards.
Bradbury and O'Neill were the youngest of the group as first- and second-year players. It's worth noting that O'Neill definitely improved the quality of his play from year one to year two. Riley Reiff was a candidate for release considering his big contract, which is not in sync with his on-field performance. He's clearly not among the top left tackles in the league. After an atrocious 2018 season, Pat Elflein did better last year. He is in his mid-twenties and should remain an adequate starter (albeit not a great one). Josh Kline was let go by the Vikings, possibly for cap reasons and the fact he is now on the wrong side of 30. Still, this is a bit of a surprising move given the team's lack of depth. 2019 fourth-round pick Dru Samia and career journeyman Dakota Dozier will fight for Kline's spot. The Vikings selected a late riser in the second round of this year's draft: Ezra Cleveland. He played over 95% of the snaps in three years at Boise State. He is mobile and very athletic, and he looks like the heir apparent to Riley Reiff (who seems likely to be released next offseason).

2020 VS 2019 OFFENSE The starting lineup remains fairly intact with 9 of 11 starters returning. The QB, RB and TE positions should provide similar production in 2020. The WR position took a hit with the loss of Stefon Diggs, a very dangerous playmaker who was among the best at contested catches. Acquiring a borderline starter like Tajae Sharpe won't be enough to replace him; let's hope rookie Justin Jefferson can have an impact right away. On the offensive line, Bradbury and O'Neill may take a leap given their young age. However, Josh Kline leaving the team is hardly good news. Accordingly, I expect Minnesota's offense to fall off a little bit. Offensive coordinator Kevin Stefanski left the team to take over as Cleveland's head coach, but the system will remain the same under new OC Gary Kubiak.
The latter oversaw the offense from the coaches' box last year, so the transition shouldn't be too difficult. Last year, the Vikings offense scored the eighth-most points in the league, and I predict this year's ranking to land between the 10th and 16th spots. Final call (2020 vs 2019): Small downgrade
3. Defensive Position-by-Position Breakdown
3.1 Defensive Linemen (DLs) The best interior defender for the Vikings was clearly Linval Joseph. Unfortunately, the cash-strapped Vikings had to release him. A few days later, Minnesota signed Michael Pierce. The former Raven has performed at a very similar level to Joseph, but he is four years younger. Acquiring the run-stuffing nose tackle has to be viewed as a bit of a positive for the Vikings defense. The other players seeing time on the interior of the defensive line aren't very good; both Shamar Stephen and Jaleel Johnson finished way below average according to the PFF grading system.

3.2 Defensive Ends (DEs) / Edge Rushers (ED) The Vikings had one of the most fearsome DE duos in Danielle Hunter and Everson Griffen. They racked up 14.5 and 8 sacks, respectively. Casual fans probably know who Danielle Hunter is, but they don't realize how good he is; he doesn't get enough credit, possibly due to playing in a smaller market. If you look at the numbers, he's been a beast. Did you know he became the youngest player in NFL history to reach the 50-sack mark? He picked up 14.5 sacks in each of the past two years, and he has averaged 10.9 over his five-year career. The former LSU player was a steal in the 3rd round of the 2015 draft! Everson Griffen is getting older at 32. At the time of writing, he has yet to sign with a team. He is very likely to find a suitor, but all signs point towards him leaving Minnesota, and that will leave a void for sure. Griffen has averaged 8.8 sacks over the last eight seasons. He has spent his entire 10-year career with the Vikings and has missed very few games. He's a true warrior. So, who will take Griffen's spot? Stephen Weatherly was a key reserve for the team last year, but he left for Carolina. Is Ifeadi Odenigbo ready to pick up the slack? He came out of nowhere to record seven sacks last year! After being chosen in the 7th round of the 2017 draft, Odenigbo had barely played any snaps in his first two seasons.
I seriously doubt he can be the long-term answer, and I believe last year's seven sacks were an outlier.

3.3 Linebackers (LBs) Eric Kendricks had one of the most improbable seasons last year. His PFF marks had varied between 59 and 69 since entering the league five years ago. Then he earned a jaw-dropping 90.1 grade in 2019, which made him the second-best linebacker in the entire league. He didn't do much as a pass rusher, but he was great defending the run and in coverage. Anthony Barr had a subpar year and finished as a below-average LB according to ProFootballFocus. He's been a steady producer, but his grades have been all over the place during his six-year career; he received his second-lowest mark in 2019. Eric Wilson has been a reserve player since joining the league in 2017, and he'll likely have a similar role in the upcoming season. He did show adequate skills last year.

3.4 Cornerbacks (CBs) Wow, a lot of shuffling has taken place in Minnesota's secondary during the offseason. Both 2019 starters, Trae Waynes and Xavier Rhodes, are gone to other teams. And their primary slot corner, Mackensie Alexander, also signed with another squad. Ouch. Waynes never played at the level of a #11 overall pick, but he provided steady play during his five-year stint with the Vikings. His PFF grades have been very consistent year over year, repeatedly finishing slightly above average among all CBs. He's the player the team will miss the most. Let's face reality: Xavier Rhodes was one of the worst corners in the league last year. His play took a big hit in 2018, and things got even worse in 2019. He really needed a change of scenery; perhaps joining the Colts will rejuvenate his career. As for Mackensie Alexander, I feel like the team should have tried harder to keep him. His first two seasons were difficult after getting drafted in the 2nd round out of Clemson, but his last couple of years were much more promising.
He could have rendered some valuable services to a team that had just lost its two starters. The door is now wide open for 2018 first-rounder Mike Hughes. The jury is still out on whether he can assume a starting role in the NFL, but I guess we'll find out very soon. The Vikings decided to address the glaring hole at the position by selecting Jeff Gladney with the 31st overall pick in this year's draft. Gladney is a sound tackler and a good blitzer too. He was a four-year starter at TCU, where he was one of just two players with at least 15 passes defended in each of the past two years. However, he has a lengthy injury history.

3.5 Safeties (S) Harrison Smith is a perennial All-Pro safety. He's been racking up tackles and interceptions throughout his eight-year NFL tenure; averaging close to 3 picks per season over such a long span is impressive. Anthony Harris went undrafted five years ago; talk about defying the odds. Fast-forward to today, and he's received 89.0 and 91.1 grades from PFF over the last two seasons, finishing as the top safety in the NFL out of 87 qualifiers. In other words, the Vikings may have the best safety duo in the NFL. The only bad news is they lost depth when both Andrew Sendejo and Jayron Kearse left via free agency.

2020 VS 2019 DEFENSE Replacing Linval Joseph with Michael Pierce is a small gain for this Vikings defense, in my opinion. That's about it for the good news for this unit. Stud pass rusher Everson Griffen seems destined to leave the team, and one of their main backups, Stephen Weatherly, signed with the Panthers. At linebacker, I don't mean to be a party pooper, but Kendricks is very unlikely to match his 2019 performance. He had been an average LB for four years; I doubt the switch suddenly flipped on and that he will keep being a top-5 linebacker in the NFL. Last year's top three CBs are gone, as well as two backups at the safety position. As of now, the team hasn't signed any free agent to replace them.
This is not a surprise, considering the team's bad cap situation. They drafted a few players, including Jeff Gladney late in the first round, but their impact remains to be seen. Many new faces on defense plus a drop in talent invariably equals a big downgrade. The team allowed the fifth-fewest points in the league last year; they'll be lucky to finish above average in 2020. Final call (2020 vs 2019): Big downgrade
4. Regular Season Wins
According to sportsbooks, the Minnesota Vikings are expected to win 8.5 games this season. Should we bet the “over” or the “under”? I'll answer this question via two different methods. 4.1 Professor MJ's Prediction I won't go into the mathematical details, but here is a summary of my own personal pick (based on my analysis above and my estimated spreads for the Vikings' 16 games): OVER 8.5 WINS
Estimated Probability: 72%
Best Odds: -121 (DraftKings)
UNDER 8.5 WINS
Estimated Probability: 28%
Best Odds: +139 (Pinnacle)
Tip: Bet OVER 8.5 wins 4.2 Based on BetOnline's Point Spreads Here is the methodology I used here:
Use BetOnline.ag’s point spreads on all 256 regular season games.
Convert those point spreads into win probabilities.
Simulate each of the 256 games, according to those win probabilities, via the R statistical software.
Repeat the previous step one million times (you get 1M simulated seasons).
Count the proportion of seasons where the Vikings won more or less than 8.5 games.
Here are the results: OVER 8.5 WINS
Estimated Probability: 63.1%
Best Odds: -121 (DraftKings)
UNDER 8.5 WINS
Estimated Probability: 36.9%
Best Odds: +139 (Pinnacle)
Tip: Bet OVER 8.5 wins Here are BetOnline’s point spreads for the Vikings’ 16 regular season games:
HOME: -6 vs ATL, -9 vs CAR, -4 vs CHI, -2.5 vs DAL, -7 vs DET, -3 vs GB, -11.5 vs JAX, -3.5 vs TEN.
My journey to completely beat stringing on the printer I gave up on.
Two years ago, youtuber RCLifeOn made a glowing review of the Tevo Tarantula. I was interested in buying a cheap 3D printer as a way to dip my toe into the water, so I bought it. It was an absolute piece of sh*t with a wobbly bed. I modified it heavily (converted it to direct drive with an E3D V6 + Titan clone) and it served me ok-ish for the last year. Recently I bought a Prusa i3 MK3S with the intention of replacing the old machine, but before throwing it away I gave it one more chance. The main issue was copious amounts of stringing. The goal was to solve these stringing issues without investing a lot of money (I'd rather buy a Prusa Mini). Here is a complete list of *everything* I had to solve to get it working:
Tighten all screws
Extruder E-steps/mm calibration
Hotend thermistor calibration
Mounted Genuine E3D V6 Nozzle
Upgraded to latest Marlin firmware (from nearly 2-years old nightly to latest bugfix release)
Validate the hotend temperature (using an external K-type thermocouple, for example)
Validate that you're testing with good filament. Not all filament can even print without stringing; Prusament is a good choice for testing.
Failure to do this up front will invalidate any previously made test prints. I speak from experience. I did all my tuning in PrusaSlicer. Here are some findings.

Retraction speed: There is quite a broad range of retraction speeds that work well. This shouldn't be your first parameter to tune; the default should work fine.

Retract amount before wipe: 0% and 100% both didn't seem to affect the print, but I did see a big improvement when I went to 50%. Ultimately I concluded that "retract amount before wipe" is an option you can use to mask stringing issues, but it doesn't solve the underlying problems. In my final profile I've set this back to 0%. What it did show me is that the printer was not managing extrusion pressure as it should, and that led me to tune linear advance and, when that did not work out, finally upgrade the firmware.

Retraction length: A direct drive extruder really doesn't need more than 1mm of retraction. When I noticed that a retraction of 1.5mm improved the output, that was a red flag. See the next section.

Extrusion multiplier: I had gotten used to printing at a 0.79 extrusion multiplier. When I saw that the Prusa used a multiplier of 1, I finally figured out that something wasn't right with the e-steps/mm calibration of my extruder. I calibrated my e-steps/mm until I had a wall width of 0.45mm while using Prusament. Having a wrong e-steps/mm value on the extruder means it is not retracting as much and as fast as configured.

Hotend temperature: My printer was lying to me. When it reported 215 degrees Celsius, it was actually printing at 225-230 degrees. A huge difference, which had a big impact on the result. I used an external K-type probe to compare the nozzle of my shitty printer with the Prusa and saw a big difference. I solved this by finding the original listing of my E3D hotend clone on AliExpress and looking up the specs of its thermistor. Then I made sure I selected the right thermistor in the Marlin Configuration.h file.
(which was set to the official E3D thermistor, while this clone used a different one). Make sure you do a new hotend PID tune after changing this.

Firmware update: This was one of the biggest and most unexpected improvements. Marlin has greatly improved linear advance in the last 1.5 years. Upgrading the firmware removed the last bit of stringing I had left. I can even use the same K-factor values as the Prusa (which isn't a goal, but it makes setting up printing profiles easier).

Printer profile: I found that I could copy the Prusa printer profile and work from there. It was a better starting point than the default profile PrusaSlicer gave me, but your mileage may vary. It does help to compare your printer's profile with the Prusa profile and understand the differences you see.

Sorry this post was so long! I don't have the time to make it shorter. I hope it was helpful.
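The e-steps correction at the heart of the extrusion-multiplier fix is simple arithmetic. A hedged sketch of the classic "extrude 100 mm, measure what actually went in" method; all numbers below are illustrative, not the author's actual values:

```python
def corrected_esteps(current_esteps: float, requested_mm: float,
                     measured_mm: float) -> float:
    """New steps/mm so the extruder feeds exactly what the firmware requests."""
    return current_esteps * requested_mm / measured_mm

# Example: firmware set to 415 steps/mm, asked for 100 mm, only 95 mm fed:
print(round(corrected_esteps(415.0, 100.0, 95.0), 1))  # 436.8
```

An under-extruding e-steps value also means every retraction is shorter and slower than configured, which is why a wrong calibration can masquerade as a stringing or retraction problem.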
And now: the news for all interested parties and pre-order customers.
Quite a lot of expert BIOS options were announced for launch, which among other things were supposed to make undervolting possible. A sneak peek was shown in der8auer's video. Unfortunately, we have to admit that due to security/warranty concerns of our ODM, we have not yet received approval to release such an expert BIOS. The BIOS now shipping is nevertheless an XMG-exclusive customization with the following additional features:
Improved fan tables for Quiet, Power Saving and Entertainment
Performance profiles can be set in the BIOS (not only in Software)
Toggle of WLAN, Bluetooth, audio/microphone and webcam is possible
Wake on LAN optional
Deactivation of Boot Logo optional
To release a more generous expert BIOS (possibly under the condition that no warranty is given), we have to renegotiate with the ODM. It may be necessary to weigh a few options or set reasonable limits. The motherboard does have mechanisms for safety shutdown in case of overload. Nevertheless, we have to prevent users who like to experiment from constantly driving their voltage regulators and other components to the limit and thus giving us an increased RMA rate. We have to ask for your patience at this point. For now, we want to get production running as smoothly as possible before we start making further optimizations. Which brings us to the next point:
SO-DIMM DDR4 memory and AMD Ryzen on AM4
At the beginning of serial production of the XMG APEX 15, with a larger number of RAM modules on hand, we unfortunately noticed incompatibilities that had not been detected in sample testing. This was because, due to logistical bottlenecks, we only had a very limited number of samples available before serial production started. Although our ODM diligently validated modules in large quantities, the results of that validation apparently could not be transferred 1:1 to serial production. Standard tests like FurMark and Prime95 are not affected by this, at least not at default settings. The CPU stress test Prime95 must be set to the "Large FFTs" preset in order to provoke errors on the memory controller. Another alternative is the stress test in AIDA64, which also puts a particularly high load on the memory controller. In many cases, errors only appear when there is a fluctuating, high graphics load at the same time. For desktop PC enthusiasts, none of this may be particularly groundbreaking. However, we would like to point out that the XMG APEX 15 has to use SO-DIMM RAM. AM4 with the Ryzen 3000 series and SO-DIMMs is virtually uncharted territory; unfortunately, best practices and RAM recommendations from desktop DIMMs cannot be transferred 1:1 to the SO-DIMM modules available on the market. All of these statements refer to dual channel operation. In single channel operation the system tolerance seems to be higher, but for the performance of the AMD Ryzen CPU, dual channel is clearly recommended.
New RAM compatibility list
After all these tests, we have now consolidated the list of compatible modules:
Corsair Vengeance CMSX8GX4M1A2666C18
Kingston ValueRAM KVR26S19S8/8
not available in EMEA
validated by ODM, not tested by us yet
The list might be extended over time, and the FAQ in the Megathread will be kept up to date accordingly. With 8GB modules up to 3200MHz in particular, we haven't had any problems so far, so you can assume that other 8GB modules up to 3200MHz (not listed here) should also work properly, although it is not guaranteed. However, caution is advised with modules of 16GB capacity and up. Modules which are not on this list might run, but under certain load conditions they might cause crashes. These include:
Windows reboot loops
Switching off the screen (Black Screen)
Anyone wishing to upgrade their XMG APEX 15 and rule out any instability should therefore use modules from the above list exclusively. We will of course be glad to morally support operation with modules not listed here (higher clock rates, lower latencies), but we cannot give any guarantees.
Order changes for various memory options
Some of the currently open pre-order customers will receive an info mail with some changes:
Orders with 16GB modules (2666) from Corsair Vengeance change to 16GB modules from Samsung (same price)
Orders of 16GB modules (2666) from Kingston ValueRAM also switch to 16GB Samsung (free upgrade)
Orders with 16GB modules (3200 CL22) from Crucial will have 4 alternatives to choose from, sorted by type and potential waiting time
Advance orders in category 3 (with the Crucial 16GB DDR4-3200 module) are quite frequent. We plan to offer those customers the following options:
Upgrade to Samsung 16GB 3200 CL19 (availability still unclear, at least 2 weeks)
Different memory brand, same specs (up to 2 weeks)
Switch to DDR4-2666
Switch to 2x 8GB DDR4-3200
We are still working on the fine details, and we hope to send out e-mails to pre-order customers tomorrow.
Keyboard Stock and Shipping Update
There has been a serious logistical bottleneck for all keyboards with non-German layouts for this model. We will receive an express shipment of new keyboards by the end of next week, which should hopefully resolve the waiting time for all international pre-order customers. We will receive additional supply in the middle of May, so feel free to place your pre-order now. Meanwhile, we have already been shipping units to German pre-order customers this week (of course only with validated and properly tested memory) and the first user reviews are trickling in. Here is one from this morning in the German forum of ComputerBase. We have also shipped review samples to multiple press outlets in the U.S., UK and Germany. Thank you for your patience and understanding while we ramp up the XMG APEX 15. We are looking forward to your feedback! // Tom
REMINDER--SUPPLY & DEMAND DO NOT EXIST--Marshall, Walras, Friedman, Hayek GTFO (I.e. a primer on economic critique. Here is a summary of lots of esoteric & pedantic economics developments in the last 100 years spread across OP & Comments)
One thing I'm always concerned about is whether people give the economics viewpoint too much credit. To this end, I am going to give a quick primer on some of the more esoteric, but important, debates in economics and why they matter to us. What I want to point out is that the state/market, equity/efficiency, command/exchange, rational/irrational, theory of choice/theory of causes, 'in theory'/'in practice', supply/demand, politics/economics distinctions do not actually hold. But, MOST IMPORTANTLY, SUPPLY AND DEMAND DO NOT EXIST! I am going to start with classical economics, move to some outside critiques and finally, in the last section, take on neoclassical economics. I have a reason for doing it this way. BIG NOTE: I am going to continue this in the COMMENTS; THIS IS NOT THE COMPLETE CRITIQUE. Preliminary observations on the two foundational bases of Supply & Demand: Classical Theories For various related reasons, the old school Labor Theories of Value (whether Smith's, Ricardo's or Marx's) do not hold in practice, nor do the classical theories of the center of gravitation, the Malthusian theory of population or the Tendency of the Profit Rate to Fall. What the old school theories do have an advantage on is a focus on production over exchange, reproduction over statics, scale & scope over constant returns, distribution over efficiency, the role of rents & land, the role of politics & so on. Thus, although commodities cannot be reduced to their time-dated basis in labor, nor will market prices converge to natural prices via some classical center of gravity, nor will long-run population & rents eat up marginal profits, the Classical Theories are better, in many ways, than modern ones.
http://gretl.ecn.wfu.edu/~cottrell/ope/archive/0709/att-0111/01-GravMec_pdf_.pdf http://www.ssoar.info/ssoabitstream/handle/document/29053/ssoar-jebo-2009-2-sinha_et_al-sraffas_system.pdf?sequence=1 http://ricardo.ecn.wfu.edu/~cottrell/ecn265/Principles.pdf That said, I am not here to argue, though I will clarify anything I post, as this is very esoteric stuff; I will not argue with either hardcore neoliberal types NOR with extremely online defenders of the Labor Theory (because, after all, I think two key insights of it are correct). There are several reasons for this. Smith's theory of value was as follows: every commodity's 'natural price' is the sum of the natural prices of the Wages + Profits + Rents which went into it, i.e. Commodity Z = W + X + Y, with W = P + Q, X = R + S, Y = T + U, so then Z = P + Q + R + S... etc. However, at no point will this reduce to zero; there will always be a commodity residue, which is to say nothing of his Diamond-Water Paradox. Ricardo saw value as a Cost of Production, based in the value of labor. However, a problem emerged: when he figured it this way, the size of output became dependent on its distribution. Marx's innovation was to see the economy as a series of reproducing matrices of production, with labor as the long-term valuation metric. The problem is that, for this to be true, labor's proportion in production has to be proportional to its output of production, i.e. the organic composition must be labor-proportional, which Marx himself acknowledged it wasn't (and furthermore, he saw competition between capitals as what equilibrated profit rates!). Furthermore, labor's output, due to scale & scope & other such things, often depended on the whole of other production; the rate of exploitation could not be determined ahead of time. Another issue was that Marx himself also acknowledged that rents (intensive + extensive + absolute) could play a role in the long-run determination of value, not just price.
Later, Morishima showed that the Tendency of the Profit Rate to Fall is false, because the installation of labor-reducing, capital-using technology will always leave wages/profits as high as or higher than they were IF it does not involve an endogenous change in social structure or power. Thus if the new capital increases owner control it may reduce output and thus profits, but otherwise it will not. http://scholarworks.umass.edu/econ_workingpape63/ https://zcomm.org/wp-content/uploads/zbooks/htdocs/books/2/2.htm https://www.pdx.edu/econ/sites/www.pdx.edu.econ/files/PSUSvM.11182016_Hahnel_Nov18_2016.pdf

Furthermore, Marx's law only worked in the static case anyway, excluding innovation, learning by doing, human capital accumulation, static returns to scale, scope & specialization, and the creation of new markets (either through products or imperialism), all of which prevent a TPRF. Monopoly, rising rents, rising worker OR employer power, state control, externalities & diseconomies of scale CAN lead to falls in profit rates, but only one at a time. I.e. unless these constantly or proportionally rose, they'd lower the profit rate once, and then adjustments would simply happen within that margin. Also, if capital is not mobile due to entry & exit costs or monetary issues, then the competition of capital which would lower its rate doesn't exist. Thus, even in the static case, scale, scope, competition, monopoly, rents, money & so on need to be accounted for. Thus the profit rate conditionally falls (equalizes) in the static case, but will not fall in the dynamic one, absent perfectly countervailing social changes. Marx's distinction of labor power from labor, and the importance of reproduction & of distribution, must all be kept & acknowledged.

Finally, quickly, on Malthus & the Commons. For Malthus, effectual demand determines current output, and in the long run, it is demographics which equilibrate. As wages rise, so does childbirth.
Eventually, he argues, because population grows geometrically & food only arithmetically, the former will outstrip the latter. In the meantime, infant mortality, sanitation & life expectancy will adjust population to output. The problems are these:
This is only true where all land is already used up, such that marginal rents are pushing against profits & wages--spare land, as long as it is not too costly to access, will prevent this process
Population doesn't grow geometrically. Development & birth rates display an inverted-U shape. Falling child mortality, longer life expectancy, greater literacy (especially among women), open access to contraception, sex education & abortion, higher incomes, stricter restrictions on entry & exit to the labor market & stricter social, moral, cultural & religious frameworks all reduce birth rates.
Food doesn't grow merely arithmetically--even before industrialization, there were varying returns to scale. Food production displays constant returns to scale in the long run, but will display increasing returns to scale, statically, due to industry, unit costs, transportation, storage, variation & complementarities, & in the long run due to innovation, learning by doing, training, investment, crop breeding & so on.
Furthermore, industry leads to rising agricultural output, because the least efficient agricultural workers go to industry, leaving the most efficient workers in agriculture & the most efficient in industry, & the two become a virtuous cycle.
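Malthus's geometric-vs-arithmetic arithmetic is easy to check, and so is how fragile it is. A toy sketch (all numbers hypothetical): under his two assumptions per-capita food collapses, but with an inverted-U, development-dependent birth rate, as in the demographic transition, population growth tapers off instead of exploding.

```python
# Toy check of Malthus's arithmetic (all numbers hypothetical).
# Assumption 1: population doubles each period (geometric growth).
# Assumption 2: food gains a fixed increment each period (arithmetic growth).
population, food = 1.0, 1.0
for t in range(10):
    population *= 2
    food += 1
per_capita = food / population   # collapses, but only under these assumptions

# With an inverted-U, development-dependent birth rate, growth tapers:
pop2 = 1.0
for t in range(10):
    growth_rate = max(0.0, 0.5 - 0.05 * t)  # fertility falls with development
    pop2 *= 1 + growth_rate
```

The contrast is the whole point: the "geometric" outcome is baked into the first loop's assumption, and disappears the moment fertility responds to development.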
Thus, where there is available land, static & dynamic returns to scale in industry and/or agriculture, static/dynamic food production, non-geometric population growth, or endogenous social change, Malthus does not apply.

Finally, the commons does not deteriorate simply from use. First of all, there are several kinds of commons goods. The social & intellectual commons, by definition, do not deteriorate. Second, where there are network effects, due to productive consumption, as in social capital, grids, social networking & so on, use increases productivity. Third, though, where the good is fixed in use, or, worse, deteriorates, its collapse depends on the following:
Throughput efficiency (output per unit of resource used) has to stay static, grow slower, or fall relative to productivity growth. If throughput efficiency keeps pace with or outruns productivity growth, then resource use will remain fine.
Using the resource must not entail complementarities which reduce harm--so, for example, some crop rotations prevent each other's deleterious effects--this is related to (1), but on a static scale
There must be static or rising returns to exploitation of the resource (if there is satiation in consumption or a glutted market, people won't take more!).
(a). People must be fully informed of the limits, or sufficiently asymmetrically informed to know to exploit it. Where information is poor, such that people don't realize the advantage gained from exploitation, they will not do so. (b). The same goes for rationality, boundedness & foresight. In other words, if people are fully informed, totally uninformed, or asymmetrically but sufficiently informed, and/or totally rational, totally irrational, or asymmetrically but sufficiently rational, then, yes, it can occur. BUT if people are only partially informed or partially rational & are otherwise equal in info & capabilities, it will not occur.
Similarly, people must be intrinsically selfish & rational. They must have a single set of rational preferences--they cannot switch between kin/political/power/money logics, cannot have preferences over preferences, etc. As soon as the latter emerge, countervailing processes may emerge. That said, the tragedy can still occur under alternate preference regimes, but rationality is a necessary condition of it necessarily happening (lol)
Institutional, coercive/cooperative, communal processes cannot exist to prevent it. If people have moral mechanisms, common habits, political institutions, property, coercion/cooperation, it can be prevented. This is often the argument for privatization, but, for privatization to prevent resource depletion would require (a). that the costs of deterioration are borne fully by the owner, (b). that they have sufficient information to prevent it, (c). that investment in throughput reduction is available, (d). (optional) that they feel some social obligation or are regulated & (e). that, even without that, the price of the finite resource rises at the interest rate, given the total stock (the Hotelling rule)--in other words, the shorter term people think, the more costly resources must be, given the total stock, in order to prevent their exhaustion--the optimal price path for a declining resource balances total stock & total time preferences.
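The Hotelling condition in (e) can be sketched with made-up numbers (the $20 starting price and 5% interest rate are hypothetical, and extraction costs are ignored for simplicity): along the equilibrium path the resource price grows at the interest rate, leaving owners exactly indifferent between extracting now and waiting.

```python
# Hotelling rule sketch (all numbers hypothetical). In equilibrium the
# price of a finite resource, net of extraction cost (ignored here),
# must rise at the interest rate r: p_t = p_0 * (1 + r)^t.
def hotelling_price(p0, r, t):
    """Equilibrium resource price t periods ahead."""
    return p0 * (1 + r) ** t

p0, r = 20.0, 0.05
p10 = hotelling_price(p0, r, 10)

# Owner indifference: extracting a unit now and banking the proceeds at r
# yields exactly the same wealth as waiting and selling at p10.
banked = p0 * (1 + r) ** 10
```

If the market price path rises slower than this, owners dump the stock now (hastening exhaustion); if faster, they hoard it, which is why the rule is a knife-edge condition rather than a guarantee.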
If information is sufficiently incomplete, asymmetric, or fundamentally uncertain, it will undercut bargaining
If transaction, bargaining, menu, or contracting costs are too high, either in monetary or temporal terms, whether absolutely (efficiency) or relative to one of the parties (equity), it will fail
If the goods & bads are undefinable or too diffuse, they will be impossible to monitor--similarly if it's costly to enforce or research the costs & contracts it will fail
If there are too many parties to the goods, the bargaining costs skyrocket
If bargaining is sufficiently non-finite or constantly re-negotiable over the long run, it will fail (after all, there are multiple overlapping generations, who care about their kids & the future, and states/corporations which persist after them)
If it is not incentive compatible--so if information & control fall on the same lines, or if it incentivizes lying about impact or cost--it will fail
If there are reputation, discovery, revelation, commitment or trust costs (or moral ones), this can make it indeterminate.
Turning to investment (the dynamic side of the invisible hand): where info is asymmetric/incomplete/uncertain, investors can simply make poor decisions
Where entry/exit or scrap/sunk costs are high, investment will be immobile, or worse, if wrong, costly to undo
Where there are dynamic externalities, there will be under/over-provision of socially beneficial/harmful investments
Where conflict/control over workers is a commodity, or there are ways to extract rents via marketing/fraud etc., investors will invest in guard labor, control & marketing, which reduces output (but increases the profit share).
Where the profit rate is sufficiently high, investors may invest in capital-reducing/labor-using technologies, which actually reduce net output & therefore undercut the positive effects of investment. This is a feature of reswitching/reverse capital deepening. Thus, although Morishima's law shows that all capital-using investment increases output, investors may nonetheless reduce output by investing in capital-reducing technology!
Similarly, where land & money are options as investments, investors will substitute them for labor & capital at the margins (due to tax policy, liquidity premiums, land speculation, rent/use rights, cheap credit & a whole host of other things), thus reducing net output.
https://www.researchgate.net/publication/306003064_A_Tale_of_Three_Theorems https://www.researchgate.net/publication/46509834_Misinterpreting_the_Coase_Theorem https://dlc.dlib.indiana.edu/dlc/bitstream/handle/10535/10024/614-4990-2-PB.pdf?sequence=1&isAllowed=y http://www.masongaffney.org/publications/I6A-1996_Taxes_Capital_and_Jobs_1978_revised.pdf http://delong.typepad.com/kalecki43.pdf

This means that the Invisible Hand's first law--of static allocation--is true in theory, but, by its very nature, implies necessary empirical features which undercut it, making it a pragmatically self-contradictory law. Add to this any number of contingent, but highly common, empirical facts, and it's basically done. The second law fails for many of the same (both necessary & contingent) empirical reasons as the first law, BUT it also fails in theory for two other reasons: 1. the existence of reswitching/reverse capital deepening with a high enough profit rate, and 2. the substitution of land & money for capital/labor for any number of reasons!

Say's law collapses for many similar reasons. Say's law basically says that supply creates its own demand: produce, and prices will fall to equilibrate quantity supplied & demanded. The issue is that, yes, supply = demand in national income accounting, but the identity holds only ex post, not ex ante. In a barter economy, supply must equal demand, people say. Now, in some sense, this is trivially true, but where there are heterogeneous capital AND consumer goods and no way to convert one readily into the other (without labor), and where consumer/capital goods are costly to store, measure, transport etc. or are highly perishable, then there will be time issues of exchange playing into the necessary transfer between these.
Thus, in a sense, all saving will equal investment in a barter economy, but the heterogeneity of capital goods, and their use in production & consumption, means that such saving & investment bears no relation to what we consider those things to be today, and the empirical properties of storage/perishability & discount rates mean that this doesn't assure optimization either. Thus, even in the barter context, Say's law is doubtful. That said, as an empirical fact, money/credit economies precede barter ones. Credit is as old as civilization, with options & so on as early as the Mesopotamians. All observed examples of barter take place in special social contexts of policy and confinement (and even these used a central state function to mint money!). https://www.unc.edu/~salemi/Econ006/Radford.pdf Otherwise credit is key. Credit precedes coin & currency in time, but they are all money. Credit/money will exist wherever:
There is an external method to certify trust--especially if said method is (a). standardized, (b). dischargeable/exchangeable, (c). store-able--but this sort of jumps the gun
There is a need for tax tokenization (a). to induce labor and/or (b). to avoid issues of perishability of in-kind goods and/or (c). avoid issues of measuring, transporting, storing & protecting of in-kind and/or (d). to assert symbolic/ornamental/sacred state/religious power
The double coincidence of wants in exchange are such that it's simply too costly, due to information, complexity, agents, diversity, heterogeneous preferences, time, space, standardization, enforcement, measurement, speculation etc. to exchange without money
Production takes place in time, such that one must leverage future risk against the present by assuring a constant flow of goods in the present to account for future production
Accounting, for whatever reason, taxation, trade, production, contracts, religion, or for kicks/the hoot of it, is costly & needs to be standardized, legible, known & common
Summing up so far: There is no necessary static TPRF and no dynamic one whatsoever
The Classical Center of gravitation does not exist
Malthusian population/demographics/commons problem doesn't apply
Land is central to economics & cannot be assumed away
Commodity/labor theories of value are problematic
The utility/demand theory of value is BS
There is, necessarily & empirically no static invisible hand, and necessarily theoretically & empirically no dynamic invisible hand
There is no Say's Law
Money, trade, credit, institutions, public goods, production etc. imply each other
Thus I will get to the firm, scale, monetary, capital & indeterminacy critiques of Marshallian economics. I will get to Walrasian & Neo-Walrasian economics & their issues. I will get to game theoretical conceptions. And I will address critiques of economics from sociology, anthropology, philosophy & psychology.

The Marshallian system says that, where there are competitive firms with defined conditions of production, 'the scissors of supply and demand' will force the equilibrium of quantity & price. Eventually, price will converge to the long run cost of production, while fluctuations or changes in demand will be accommodated by price. Thus, price & quantity adjust between these two bounds. Notable, however, is the fact that partial equilibrium of a single firm/commodity/industry is different from general equilibrium--general equilibrium posits a meta-stable, determinate solution to all the partial equilibria.

The failures of the Marshallian tradition come down to this: First, it would seem that anything but constant returns to scale, in the long run, implies a contradiction. The converse of this is that the presence of non-constant returns to scale means that either (a). the system is not in equilibrium or (b). it is not competitive. Sraffa was the first to show this. Why? Because firm-level increasing returns to scale imply the firm would take over its industry. Industry-level ones mean that, at a certain point, economies would be infinite. The only coherent ones are returns to scale exogenous to a firm but internal to an industry, as in software. Furthermore, increasing & decreasing returns come from different sources--one is found in production, the other in distribution--so making them homologous fails extraordinarily.
https://edisciplinas.usp.bpluginfile.php/832648/mod_resource/content/3/The%20laws%20of%20returns%20under%20competitive%20conditions_Sraffa_1926.pdf http://www.ier.hit-u.ac.jp/~nisizawa/annalisa%20rosselli.pdf

Second, the interdependence of inputs & outputs means that leaping from partial to general equilibrium is impossible. This is the famous Leontief model. In this model, due to interdependence, there is always a range of values & prices that capital & consumer goods can take, given that they play into each other. Equilibrium may force a model into this range, but within that range itself, other factors like markup will play a role! http://www.nber.org/chapters/c2866.pdf https://www.jstor.org/stable/2223643?seq=1#page_scan_tab_contents https://www.usna.edu/Users/math/meh/leon.pdf
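A minimal two-sector Leontief sketch makes the point concrete (the input coefficients, labour requirements and markups below are all hypothetical): technology plus final demand pins down gross outputs uniquely, but the consistent price vector still depends on a distributive variable like the markup, so "equilibrium" leaves a whole range of prices open.

```python
# Two-sector Leontief sketch (all coefficients hypothetical).
# a[i][j] = amount of good i used up per unit of good j produced.
def solve2(m11, m12, m21, m22, b1, b2):
    """Solve the 2x2 system [[m11,m12],[m21,m22]] z = (b1,b2) by Cramer's rule."""
    det = m11 * m22 - m12 * m21
    return ((b1 * m22 - b2 * m12) / det, (m11 * b2 - m21 * b1) / det)

a = [[0.2, 0.3],
     [0.4, 0.1]]
d = (10.0, 5.0)                      # final demand for each good
# Gross outputs x solve x = A x + d, i.e. (I - A) x = d: fixed by technology.
x = solve2(1 - a[0][0], -a[0][1], -a[1][0], 1 - a[1][1], *d)

# Prices are NOT fixed by technology alone: with a uniform markup m on
# costs, p = (1 + m)(pA + w l), every feasible m gives a different
# consistent price vector.
def prices(m, w=1.0, l=(1.0, 1.0)):
    # Row-vector system p (I - (1+m)A) = (1+m) w l, solved via its transpose.
    k = 1 + m
    return solve2(1 - k * a[0][0], -k * a[1][0],
                  -k * a[0][1], 1 - k * a[1][1],
                  k * w * l[0], k * w * l[1])

p_low, p_high = prices(0.1), prices(0.3)
```

Both `p_low` and `p_high` are perfectly consistent with the same technique and the same gross outputs; only the markup (distribution) differs, which is the interdependence point above.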
There is no scissors of supply & demand, (a). because long term labor cost is incoherent, (b). because long term demand is incoherent, (c). because quantity, quality, price & non-economic aspects of trade/production always coexist, & (d). because leaping from tokens of exchange to type-level laws is problematic